
Linear algebra

[Figure: The three-dimensional Euclidean space R^3 is a vector space; lines and planes passing through the origin are vector subspaces of R^3.]

Linear algebra is the branch of mathematics concerning finite or countably infinite dimensional vector spaces, as well as linear mappings between such spaces. Such an investigation is initially motivated by a system of linear equations in several unknowns. Such equations are naturally represented using the formalism of matrices and vectors.[1]

Linear algebra is central to both pure and applied mathematics. For instance, abstract algebra arises by relaxing the axioms of a vector space, leading to a number of generalizations. Functional analysis studies the infinite-dimensional version of the theory of vector spaces. Combined with calculus, linear algebra facilitates the solution of linear systems of differential equations. Techniques from linear algebra are also used in analytic geometry, engineering, physics, natural sciences, computer science, and the social sciences (particularly in economics). Because linear algebra is such a well-developed theory, nonlinear mathematical models are sometimes approximated by linear ones.

History

The study of linear algebra and matrices first emerged from the study of determinants, which were used to solve systems of linear equations. Determinants were used by Leibniz in 1693, and subsequently Cramer devised Cramer's Rule for solving linear systems in 1750. Later, Gauss further developed the theory of solving linear systems by using Gaussian elimination, which was initially listed as an advancement in geodesy.[2]

The study of matrix algebra first emerged in England in the mid-1800s. In 1848, Sylvester introduced the term matrix, which is Latin for "womb". While studying compositions of linear transformations, Arthur Cayley was led to define matrix multiplication and inverses. Crucially, Cayley used a single letter to denote a matrix, thus treating a matrix as an aggregate object. He also realized the connection between matrices and determinants, and wrote that "There would be many things to say about this theory of matrices which should, it seems to me, precede the theory of determinants".[3]

The first modern and more precise definition of a vector space was introduced by Peano in 1888;[4] by 1900, a theory of linear transformations of finite-dimensional vector spaces had emerged. Linear algebra first took its modern form in the first half of the twentieth century, when many ideas and methods of previous centuries were generalized as abstract algebra. The use of matrices in quantum mechanics, special relativity, and statistics helped spread the subject of linear algebra beyond pure mathematics. The development of computers led to increased research in efficient algorithms for Gaussian elimination and matrix decompositions, and linear algebra became an essential tool for modelling and simulations.[5]

The origin of many of these ideas is discussed in the articles on determinants and Gaussian elimination.


Scope of study

Vector spaces

The main structures of linear algebra are vector spaces. A vector space over a field F is a set V together with two binary operations. Elements of V are called vectors and elements of F are called scalars. The first operation, vector addition, takes any two vectors v and w and outputs a third vector v + w. The second operation, scalar multiplication, takes any scalar a and any vector v and outputs a new vector av, which can be thought of as the vector v rescaled by the scalar a. The operations of addition and multiplication in a vector space satisfy the following axioms.[6] In the list below, let u, v and w be arbitrary vectors in V, and a and b scalars in F.

Associativity of addition: u + (v + w) = (u + v) + w

Commutativity of addition: u + v = v + u

Identity element of addition: there exists an element 0 ∈ V, called the zero vector, such that v + 0 = v for all v ∈ V.

Inverse elements of addition: for every v ∈ V, there exists an element −v ∈ V, called the additive inverse of v, such that v + (−v) = 0.

Distributivity of scalar multiplication with respect to vector addition: a(u + v) = au + av

Distributivity of scalar multiplication with respect to field addition: (a + b)v = av + bv

Compatibility of scalar multiplication with field multiplication: a(bv) = (ab)v [7]

Identity element of scalar multiplication: 1v = v, where 1 denotes the multiplicative identity in F.

Elements of a general vector space V may be objects of any nature, for example, functions, polynomials, vectors, or matrices. Linear algebra is concerned with properties common to all vector spaces.
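To make these axioms concrete: the coordinate space R^n satisfies all of them, and the following minimal Python/NumPy sketch (with arbitrary illustrative values) spot-checks several axioms for vectors in R^3:

    import numpy as np

    # Vectors in R^3; scalars are real numbers
    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])
    w = np.array([7.0, 8.0, 9.0])
    a, b = 2.0, -3.0

    # Associativity and commutativity of addition
    assert np.allclose(u + (v + w), (u + v) + w)
    assert np.allclose(u + v, v + u)

    # Zero vector and additive inverses
    zero = np.zeros(3)
    assert np.allclose(v + zero, v)
    assert np.allclose(v + (-v), zero)

    # Distributivity, compatibility, and the identity scalar
    assert np.allclose(a * (u + v), a * u + a * v)
    assert np.allclose((a + b) * v, a * v + b * v)
    assert np.allclose(a * (b * v), (a * b) * v)
    assert np.allclose(1.0 * v, v)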

Linear transformations

As in the theory of other algebraic structures, linear algebra studies mappings between vector spaces that preserve the vector-space structure. Given two vector spaces V and W over a field F, a linear transformation (also called linear map, linear mapping or linear operator) is a map

T : V → W

that is compatible with addition and scalar multiplication:

T(u + v) = T(u) + T(v) and T(av) = aT(v)

for any vectors u, v ∈ V and any scalar a ∈ F. When a bijective linear mapping exists between two vector spaces (that is, every vector from the first space is associated with exactly one in the second), we say that the two spaces are isomorphic. Because an isomorphism preserves linear structure, two isomorphic vector spaces are "essentially the same" from the linear algebra point of view. One essential question in linear algebra is whether a mapping is an isomorphism or not, and, for maps between spaces of the same finite dimension, this question can be answered by checking whether the determinant is nonzero. If a mapping is not an isomorphism, linear algebra is interested in finding its range (or image) and the set of elements that get mapped to zero, called the kernel of the mapping.
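For example, a square matrix A defines the linear map x ↦ Ax, and the isomorphism test above reduces to computing a determinant; a minimal NumPy sketch:

    import numpy as np

    # A linear map R^2 -> R^2 given by x |-> A @ x
    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])

    u = np.array([1.0, -1.0])
    v = np.array([0.5, 2.0])
    a = 3.0

    # Linearity: A(u + v) = Au + Av and A(av) = a(Av)
    assert np.allclose(A @ (u + v), A @ u + A @ v)
    assert np.allclose(A @ (a * v), a * (A @ v))

    # The map is an isomorphism iff det(A) != 0
    print(np.linalg.det(A))  # 1.0 here, so this map is invertible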


Subspaces, span, and basis

Again in analogy with theories of other algebraic objects, linear algebra is interested in subsets of vector spaces that are vector spaces themselves; these subsets are called linear subspaces. For instance, the range and kernel of a linear mapping are both subspaces, and are thus often called the range space and the nullspace; these are important examples of subspaces. Another important way of forming a subspace is to take a linear combination of a set of vectors v1, v2, …, vk:

a1v1 + a2v2 + ⋯ + akvk

where a1, a2, …, ak are scalars. The set of all linear combinations of vectors v1, v2, …, vk is called their span, which forms a subspace.

A linear combination of any system of vectors with all zero coefficients is the zero vector of V. If this is the only way to express the zero vector as a linear combination of v1, v2, …, vk, then these vectors are linearly independent. Given a set of vectors that span a space, if any vector is a linear combination of the other vectors (and so the set is not linearly independent), then the span would remain the same if we removed that vector from the set. Thus, a set of linearly dependent vectors is redundant in the sense that some linearly independent subset will span the same subspace. Therefore, we are mostly interested in a linearly independent set of vectors that spans a vector space V, which we call a basis of V. Any set of vectors that spans V contains a basis, and any linearly independent set of vectors in V can be extended to a basis.[8] It turns out that if we accept the axiom of choice, every vector space has a basis;[9] nevertheless, this basis may be unnatural, and indeed, may not even be constructible. For instance, there exists a basis for the real numbers considered as a vector space over the rationals, but no explicit basis has been constructed.

Any two bases of a vector space V have the same cardinality, which is called the dimension of V. The dimension of a vector space is well-defined by the dimension theorem for vector spaces. If a basis of V has a finite number of elements, V is called a finite-dimensional vector space. If V is finite-dimensional and U is a subspace of V, then dim U ≤ dim V. If U1 and U2 are subspaces of V, then

dim(U1 + U2) = dim U1 + dim U2 − dim(U1 ∩ U2).[10]

One often restricts consideration to finite-dimensional vector spaces. A fundamental theorem of linear algebra states that all vector spaces of the same dimension are isomorphic,[11] giving an easy way of characterizing isomorphism.
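Numerically, linear independence of finitely many vectors in R^n can be tested by comparing the rank of the matrix whose columns are those vectors with the number of vectors; a short NumPy sketch with illustrative vectors:

    import numpy as np

    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = v1 + 2.0 * v2  # deliberately a linear combination of v1 and v2

    M = np.column_stack([v1, v2, v3])

    # rank < number of columns means the columns are linearly dependent
    print(np.linalg.matrix_rank(M))                          # 2: {v1, v2, v3} is dependent
    print(np.linalg.matrix_rank(np.column_stack([v1, v2])))  # 2: {v1, v2} is independent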

Vectors as n-tuples: matrix theory

A particular basis {v1, v2, …, vn} of V allows one to construct a coordinate system in V: the vector with coordinates (a1, a2, …, an) is the linear combination

a1v1 + a2v2 + ⋯ + anvn

The condition that v1, v2, …, vn span V guarantees that each vector v can be assigned coordinates, whereas the linear independence of v1, v2, …, vn assures that these coordinates are unique (i.e. there is only one linear combination of the basis vectors that is equal to v). In this way, once a basis of a vector space V over F has been chosen, V may be identified with the coordinate n-space F^n. Under this identification, addition and scalar multiplication of vectors in V correspond to addition and scalar multiplication of their coordinate vectors in F^n. Furthermore, if V and W are an n-dimensional and m-dimensional vector space over F, and a basis of V and a basis of W have been fixed, then any linear transformation T: V → W may be encoded by an m × n matrix A with entries in the field F, called the matrix of T with respect to these bases. Two matrices that encode the same linear transformation in different bases are called similar. Matrix theory replaces the study of linear transformations, which were defined axiomatically, by the study of matrices, which are concrete objects. This major technique distinguishes linear algebra from theories of other algebraic structures, which usually cannot be parametrized so concretely.

There is an important distinction between the coordinate n-space R^n and a general finite-dimensional vector space V. While R^n has a standard basis {e1, e2, …, en}, a vector space V typically does not come equipped with a basis, and many different bases exist (although they all consist of the same number of elements, equal to the dimension of V).
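Concretely, the coordinates a of a vector v with respect to a basis of R^n are found by solving the linear system B a = v, where the columns of B are the basis vectors; a brief NumPy sketch:

    import numpy as np

    # A (non-standard) basis of R^2, stored as the columns of B
    B = np.column_stack([np.array([1.0, 1.0]),
                         np.array([1.0, -1.0])])
    v = np.array([3.0, 1.0])

    # The coordinates a of v in this basis satisfy B @ a = v
    a = np.linalg.solve(B, v)
    print(a)                      # [2. 1.], i.e. v = 2*(1, 1) + 1*(1, -1)
    assert np.allclose(B @ a, v)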


One major application of matrix theory is the calculation of determinants, a central concept in linear algebra. While determinants could be defined in a basis-free manner, they are usually introduced via a specific representation of the mapping; the value of the determinant does not depend on the specific basis. It turns out that a mapping is invertible if and only if the determinant is nonzero. If the determinant is zero, then the nullspace is nontrivial. Determinants have other applications, including a systematic way of seeing if a set of vectors is linearly independent (we write the vectors as the columns of a matrix, and if the determinant of that matrix is zero, the vectors are linearly dependent). Determinants could also be used to solve systems of linear equations (see Cramer's rule), but in real applications, Gaussian elimination is a faster method.

Eigenvalues and eigenvectors

In general, the action of a linear transformation is hard to understand, and so to get a better handle on linear transformations, those vectors that are relatively fixed by that transformation are given special attention. To make this more concrete, let T : V → V be any linear transformation. We are especially interested in those non-zero vectors v such that Tv = λv, where λ is a scalar in the base field of the vector space. These vectors are called eigenvectors, and the corresponding scalars λ are called eigenvalues.

To find an eigenvector or an eigenvalue, we note that

Tv − λv = (T − λI)v = 0,

where I is the identity matrix. For there to be nontrivial solutions to that equation, det(T − λI) = 0. The determinant det(T − λI) is a polynomial in λ, and so the eigenvalues are not guaranteed to exist if the field is R. Thus, we often work with an algebraically closed field such as the complex numbers when dealing with eigenvectors and eigenvalues, so that an eigenvalue will always exist. It would be particularly nice if, given a transformation T taking a vector space V

into itself, we can find a basis for V consisting of eigenvectors. If such a basis exists, we can easily compute the action of the transformation on any vector: if v1, v2, …, vn are linearly independent eigenvectors of a mapping of n-dimensional spaces with (not necessarily distinct) eigenvalues λ1, λ2, …, λn, and if

v = a1v1 + a2v2 + ⋯ + anvn, then

T(v) = a1λ1v1 + a2λ2v2 + ⋯ + anλnvn.

Such a transformation is called diagonalizable, since in the eigenbasis the transformation is represented by a diagonal matrix. Because operations like matrix multiplication, matrix inversion, and determinant calculation are simple on diagonal matrices, computations involving matrices are much simpler if we can bring the matrix to a diagonal form. Not all matrices are diagonalizable (even over an algebraically closed field), but diagonalizable matrices form a dense subset of all matrices.
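A short NumPy sketch of this idea: np.linalg.eig returns the eigenvalues and eigenvectors of a matrix, and for a diagonalizable matrix the eigenvector matrix P and the diagonal matrix D of eigenvalues recover A = P D P⁻¹:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors
    D = np.diag(eigvals)

    # Each eigenpair satisfies A v = lambda v
    for lam, vec in zip(eigvals, P.T):
        assert np.allclose(A @ vec, lam * vec)

    # Diagonalization: A = P D P^{-1}
    assert np.allclose(A, P @ D @ np.linalg.inv(P))
    print(eigvals)  # eigenvalues 3 and 1 (ordering may vary)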

Inner-product spaces

Besides these basic concepts, linear algebra also studies vector spaces with additional structure, such as an inner product. The inner product is an example of a bilinear form, and it gives the vector space a geometric structure by allowing for the definition of length and angles. Formally, an inner product is a map

⟨·, ·⟩ : V × V → F

that satisfies the following three axioms for all vectors u, v, w ∈ V and all scalars a ∈ F:[12][13]

• Conjugate symmetry: ⟨u, v⟩ is the complex conjugate of ⟨v, u⟩.

Note that in R, the inner product is symmetric.

• Linearity in the first argument: ⟨au, v⟩ = a⟨u, v⟩ and ⟨u + v, w⟩ = ⟨u, w⟩ + ⟨v, w⟩


• Positive-definiteness: ⟨v, v⟩ ≥ 0,

with equality only for v = 0. We can define the length of a vector v by ‖v‖ = √⟨v, v⟩, and we can prove the Cauchy–Schwarz inequality:

|⟨u, v⟩| ≤ ‖u‖ ‖v‖.

In particular, the quantity

⟨u, v⟩ / (‖u‖ ‖v‖)

lies in the interval [−1, 1], and so we can call this quantity the cosine of the angle between the two vectors.

Two vectors u and v are orthogonal if ⟨u, v⟩ = 0. An orthonormal basis is a basis where all basis vectors have length 1 and are orthogonal to each other. Given any finite-dimensional vector space, an orthonormal basis can be found by the Gram–Schmidt procedure. Orthonormal bases are particularly nice to deal with, since if

v = a1e1 + ⋯ + anen, then ai = ⟨v, ei⟩.

The inner product facilitates the construction of many useful concepts. For instance, given a transform T, we can define its Hermitian conjugate T* as the linear transform satisfying

⟨Tu, v⟩ = ⟨u, T*v⟩.

If T satisfies TT* = T*T, we call T normal. It turns out that normal matrices are precisely the matrices that have an orthonormal system of eigenvectors that span V.
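A minimal Python sketch of the Gram–Schmidt procedure mentioned above, using the standard dot product on R^n (gram_schmidt is an illustrative helper written for this example, not a library routine):

    import numpy as np

    def gram_schmidt(vectors):
        """Orthonormalize a list of linearly independent vectors in R^n."""
        basis = []
        for v in vectors:
            w = np.array(v, dtype=float)
            # Subtract the projection of v onto each basis vector found so far
            for e in basis:
                w -= np.dot(w, e) * e
            basis.append(w / np.linalg.norm(w))
        return basis

    e1, e2 = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
    print(np.dot(e1, e2))                          # ~0: the vectors are orthogonal
    print(np.linalg.norm(e1), np.linalg.norm(e2))  # 1.0 1.0: each has unit length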

Some main useful theorems

• A matrix is invertible, or non-singular, if and only if the linear map represented by the matrix is an isomorphism.
• Any vector space over a field F of dimension n is isomorphic to F^n as a vector space over F.
• Corollary: Any two vector spaces over F of the same finite dimension are isomorphic to each other.
• A linear map is an isomorphism if and only if the determinant is nonzero.

Applications

Because of the ubiquity of vector spaces, linear algebra is used in many fields of mathematics, natural sciences, computer science, and social science. Below are just some examples of applications of linear algebra.

Solution of linear systems

Linear algebra provides the formal setting for the linear combination of equations used in the Gaussian method. Suppose the goal is to find and describe the solution(s), if any, of the following system of linear equations:

2x + y − z = 8      (L1)
−3x − y + 2z = −11  (L2)
−2x + y + 2z = −3   (L3)

The Gaussian-elimination algorithm is as follows: eliminate x from all equations below L1, and then eliminate y from all equations below L2. This will put the system into triangular form. Then, using back-substitution, each unknown can be solved for.

In the example, x is eliminated from L2 by adding (3/2)L1 to L2, and x is then eliminated from L3 by adding L1 to L3. Formally:

L2 + (3/2)L1 → L2
L3 + L1 → L3


The result is:

2x + y − z = 8
(1/2)y + (1/2)z = 1
2y + z = 5

Now y is eliminated from L3 by adding −4L2 to L3:

L3 + (−4)L2 → L3

The result is:

2x + y − z = 8
(1/2)y + (1/2)z = 1
−z = 1

This result is a system of linear equations in triangular form, and so the first part of the algorithm is complete. The last part, back-substitution, consists of solving for the unknowns in reverse order. It can thus be seen that

z = −1.

Then, z can be substituted into L2, which can then be solved to obtain

y = 3.

Next, z and y can be substituted into L1, which can be solved to obtain

x = 2.

The system is solved.

We can, in general, write any system of linear equations as a matrix equation:

Ax = b.

The solution of this system is characterized as follows: first, we find a particular solution x0 of this equation using Gaussian elimination. Then, we compute the solutions of Ax = 0; that is, we find the nullspace N of A. The solution set of this equation is given by x0 + N = {x0 + n : n ∈ N}. If the number of variables equals the number of equations, then we can characterize when the system has a unique solution: since N is trivial if and only if

det A ≠ 0, the equation Ax = b has a unique solution if and only if det A ≠ 0.[14]
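The same example can be checked with NumPy, whose np.linalg.solve routine is built on an LU factorization (a numerically careful variant of Gaussian elimination):

    import numpy as np

    # The system 2x + y - z = 8, -3x - y + 2z = -11, -2x + y + 2z = -3
    A = np.array([[ 2.0,  1.0, -1.0],
                  [-3.0, -1.0,  2.0],
                  [-2.0,  1.0,  2.0]])
    b = np.array([8.0, -11.0, -3.0])

    x = np.linalg.solve(A, b)
    print(x)                 # [ 2.  3. -1.], i.e. x = 2, y = 3, z = -1
    print(np.linalg.det(A))  # nonzero, so the solution is unique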

Least-squares best fit line

The least squares method is used to determine the best fit line for a set of data.[15] This line will minimize the sum of the squares of the residuals.
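A minimal sketch using np.linalg.lstsq, which minimizes the sum of squared residuals for the line y = mx + c (the data points are made up for illustration):

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([0.1, 0.9, 2.2, 2.8])  # noisy samples near y = x

    # Design matrix with a column of ones for the intercept
    X = np.column_stack([x, np.ones_like(x)])

    # Minimizes ||X @ [m, c] - y||^2 over the slope m and intercept c
    (m, c), residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
    print(m, c)  # slope ~0.94, intercept ~0.09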

Fourier series expansion

Fourier series are a representation of a function f : [−π, π] → R as a trigonometric series:

f(x) = a0/2 + Σ (an cos nx + bn sin nx), the sum taken over n = 1, 2, 3, …

This series expansion is extremely useful in solving partial differential equations. In this article, we will not be concerned with convergence issues; it is nice to note that all continuous functions have a converging Fourier series expansion, and nice enough discontinuous functions have a Fourier series that converges to the function value at most points. The space of all functions that can be represented by a Fourier series form a vector space (technically speaking, we call functions that have the same Fourier series expansion the "same" function, since two different discontinuous functions might have the same Fourier series). Moreover, this space is also an inner product space with the inner product

⟨f, g⟩ = ∫ f(x)g(x) dx, the integral taken over [−π, π].

The functions (1/√π) cos(nx) for n ≥ 1 and (1/√π) sin(nx) for n ≥ 1 are an orthonormal basis for the space of Fourier-expandable functions. We can thus use the tools of linear algebra to find the expansion of any function in this space in terms of these basis functions. For instance, to find the coefficient ak, we take the inner product with cos(kx):

and by orthonormality, ⟨f, cos(kx)⟩ = ak π; that is,

ak = (1/π) ⟨f, cos(kx)⟩ = (1/π) ∫ f(x) cos(kx) dx, with the integral over [−π, π].
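A quick numerical check of this coefficient formula, approximating the integrals by Riemann sums on a grid (the square wave is an illustrative choice):

    import numpy as np

    x = np.linspace(-np.pi, np.pi, 200001)
    dx = x[1] - x[0]
    f = np.sign(x)  # a square wave on [-pi, pi]

    def fourier_coeffs(k):
        # a_k = (1/pi) * integral of f(x) cos(kx); b_k likewise with sin
        a_k = np.sum(f * np.cos(k * x)) * dx / np.pi
        b_k = np.sum(f * np.sin(k * x)) * dx / np.pi
        return a_k, b_k

    print(fourier_coeffs(1))  # approx (0.0, 1.2732): b_1 = 4/pi for the square wave
    print(fourier_coeffs(2))  # approx (0.0, 0.0): even-order coefficients vanish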

Quantum mechanics

Quantum mechanics is highly inspired by notions in linear algebra. In quantum mechanics, the physical state of a particle is represented by a vector, and observables (such as momentum, energy, and angular momentum) are represented by linear operators on the underlying vector space. More concretely, the wave function ψ of a particle describes its physical state and lies in the vector space L^2 (the functions ψ such that

∫ |ψ(x)|² dx is finite), and it evolves according to the Schrödinger equation. Energy is

represented as the operator H = −(ℏ²/2m) ∂²/∂x² + V, where V is the potential energy. H is also known as the

Hamiltonian operator. The eigenvalues of H represent the possible energies that can be observed. Given a particle in some state ψ, we can expand ψ into a linear combination of eigenstates of H. The component of ψ in each eigenstate determines the probability of measuring the corresponding eigenvalue, and the measurement forces the particle to assume that eigenstate (wave function collapse).

Generalizations and related topics

Since linear algebra is a successful theory, its methods have been developed and generalized in other parts of mathematics. In module theory, one replaces the field of scalars by a ring. The concepts of linear independence, span, basis, and dimension (which is called rank in module theory) still make sense. Nevertheless, many theorems from linear algebra become false in module theory. For instance, not all modules have a basis (those that do are called free modules), the rank of a free module is not necessarily unique, not all linearly independent subsets of a module can be extended to form a basis, and not all subsets of a module that span the space contain a basis.

In multilinear algebra, one considers multivariable linear transformations, that is, mappings that are linear in each of a number of different variables. This line of inquiry naturally leads to the idea of the dual space, the vector space V*

consisting of linear maps f : V → F, where F is the field of scalars. Multilinear maps can be described via tensor products of elements of V*.

If, in addition to vector addition and scalar multiplication, there is a bilinear vector product, then the vector space is called an algebra; for instance, associative algebras are algebras with an associative vector product (like the algebra of square matrices, or the algebra of polynomials).

Functional analysis mixes the methods of linear algebra with those of mathematical analysis and studies various function spaces, such as Lp spaces.

Representation theory studies the actions of algebraic objects on vector spaces by representing these objects as matrices. It is interested in all the ways that this is possible, and it does so by finding subspaces invariant under all transformations of the algebra. The concept of eigenvalues and eigenvectors is especially important.


Notes

[1] Weisstein, Eric. "Linear Algebra" (http://mathworld.wolfram.com/LinearAlgebra.html). From MathWorld, a Wolfram Web Resource. Retrieved 16 April 2012.
[2] Vitulli, Marie. "A Brief History of Linear Algebra and Matrix Theory" (http://darkwing.uoregon.edu/~vitulli/441.sp04/LinAlgHistory.html). Department of Mathematics, University of Oregon. Retrieved 24 January 2012.
[3] Vitulli, Marie
[4] Vitulli, Marie
[5] Vitulli, Marie
[6] Roman 2005, ch. 1, p. 27
[7] This axiom is not asserting the associativity of an operation, since there are two operations in question: scalar multiplication, bv; and field multiplication, ab.
[8] Axler (2004), pp. 28–29
[9] The existence of a basis is straightforward for countably generated vector spaces, and for well-ordered vector spaces, but in full generality it is logically equivalent to the axiom of choice.
[10] Axler (2004), p. 33
[11] Axler (2004), p. 55
[12] P. K. Jain, Khalil Ahmad (1995). "5.1 Definitions and basic properties of inner product spaces and Hilbert spaces" (http://books.google.com/?id=yZ68h97pnAkC&pg=PA203). Functional Analysis (2nd ed.). New Age International. p. 203. ISBN 812240801X.
[13] Eduard Prugovečki (1981). "Definition 2.1" (http://books.google.com/?id=GxmQxn2PF3IC&pg=PA18). Quantum Mechanics in Hilbert Space (2nd ed.). Academic Press. pp. 18 ff. ISBN 012566060X.
[14] Gunawardena, Jeremy. "Matrix algebra for beginners, Part I" (http://vcp.med.harvard.edu/papers/matrices-1.pdf). Harvard Medical School. Retrieved 2 May 2012.
[15] Miller, Steven. "The Method of Least Squares" (http://web.williams.edu/go/math/sjmiller/public_html/BrownClasses/54/handouts/MethodLeastSquares.pdf). Brown University. Retrieved 3 May 2012.

Further reading

History
• Fearnley-Sander, Desmond, "Hermann Grassmann and the Creation of Linear Algebra" (http://mathdl.maa.org/images/upload_library/22/Ford/DesmondFearnleySander.pdf), American Mathematical Monthly 86 (1979), pp. 809–817.
• Grassmann, Hermann, Die lineale Ausdehnungslehre ein neuer Zweig der Mathematik: dargestellt und durch Anwendungen auf die übrigen Zweige der Mathematik, wie auch auf die Statik, Mechanik, die Lehre vom Magnetismus und die Krystallonomie erläutert, O. Wigand, Leipzig, 1844.

Introductory textbooks
• Bretscher, Otto (June 28, 2004), Linear Algebra with Applications (3rd ed.), Prentice Hall, ISBN 978-0131453340
• Farin, Gerald; Hansford, Dianne (December 15, 2004), Practical Linear Algebra: A Geometry Toolbox, AK Peters, ISBN 978-1568812342
• Friedberg, Stephen H.; Insel, Arnold J.; Spence, Lawrence E. (November 11, 2002), Linear Algebra (4th ed.), Prentice Hall, ISBN 978-0130084514
• Hefferon, Jim (2008), Linear Algebra (http://joshua.smcvt.edu/linearalgebra/)
• Anton, Howard (2005), Elementary Linear Algebra (Applications Version) (9th ed.), Wiley International
• Lay, David C. (August 22, 2005), Linear Algebra and Its Applications (3rd ed.), Addison Wesley, ISBN 978-0321287137
• Kolman, Bernard; Hill, David R. (May 3, 2007), Elementary Linear Algebra with Applications (9th ed.), Prentice Hall, ISBN 978-0132296540
• Leon, Steven J. (2006), Linear Algebra With Applications (7th ed.), Pearson Prentice Hall, ISBN 978-0131857858
• Poole, David (2010), Linear Algebra: A Modern Introduction (3rd ed.), Cengage – Brooks/Cole, ISBN 978-0538735452


• Ricardo, Henry (2010), A Modern Introduction To Linear Algebra (1st ed.), CRC Press, ISBN 978-1-4398-0040-9
• Strang, Gilbert (July 19, 2005), Linear Algebra and Its Applications (4th ed.), Brooks Cole, ISBN 978-0030105678

Advanced textbooks
• Axler, Sheldon (February 26, 2004), Linear Algebra Done Right (2nd ed.), Springer, ISBN 978-0387982588
• Bhatia, Rajendra (November 15, 1996), Matrix Analysis, Graduate Texts in Mathematics, Springer, ISBN 978-0387948461
• Demmel, James W. (August 1, 1997), Applied Numerical Linear Algebra, SIAM, ISBN 978-0898713893
• Gantmacher, F.R. (2005, 1959 edition), Applications of the Theory of Matrices, Dover Publications, ISBN 978-0486445540
• Gantmacher, Felix R. (1990), Matrix Theory Vol. 1 (2nd ed.), American Mathematical Society, ISBN 978-0821813768
• Gantmacher, Felix R. (2000), Matrix Theory Vol. 2 (2nd ed.), American Mathematical Society, ISBN 978-0821826645
• Gelfand, I. M. (1989), Lectures on Linear Algebra, Dover Publications, ISBN 978-0486660820
• Glazman, I. M.; Ljubic, Ju. I. (2006), Finite-Dimensional Linear Analysis, Dover Publications, ISBN 978-0486453323
• Golan, Johnathan S. (January 2007), The Linear Algebra a Beginning Graduate Student Ought to Know (2nd ed.), Springer, ISBN 978-1402054945
• Golan, Johnathan S. (August 1995), Foundations of Linear Algebra, Kluwer, ISBN 0792336143
• Golub, Gene H.; Van Loan, Charles F. (October 15, 1996), Matrix Computations, Johns Hopkins Studies in Mathematical Sciences (3rd ed.), The Johns Hopkins University Press, ISBN 978-0801854149
• Greub, Werner H. (October 16, 1981), Linear Algebra, Graduate Texts in Mathematics (4th ed.), Springer, ISBN 978-0801854149
• Hoffman, Kenneth; Kunze, Ray (April 25, 1971), Linear Algebra (2nd ed.), Prentice Hall, ISBN 978-0135367971
• Halmos, Paul R. (August 20, 1993), Finite-Dimensional Vector Spaces, Undergraduate Texts in Mathematics, Springer, ISBN 978-0387900933
• Horn, Roger A.; Johnson, Charles R. (February 23, 1990), Matrix Analysis, Cambridge University Press, ISBN 978-0521386326
• Horn, Roger A.; Johnson, Charles R. (June 24, 1994), Topics in Matrix Analysis, Cambridge University Press, ISBN 978-0521467131
• Lang, Serge (March 9, 2004), Linear Algebra, Undergraduate Texts in Mathematics (3rd ed.), Springer, ISBN 978-0387964126
• Marcus, Marvin; Minc, Henryk (2010), A Survey of Matrix Theory and Matrix Inequalities, Dover Publications, ISBN 978-0486671024
• Meyer, Carl D. (February 15, 2001), Matrix Analysis and Applied Linear Algebra (http://www.matrixanalysis.com/DownloadChapters.html), Society for Industrial and Applied Mathematics (SIAM), ISBN 978-0898714548
• Mirsky, L. (1990), An Introduction to Linear Algebra, Dover Publications, ISBN 978-0486664347
• Roman, Steven (March 22, 2005), Advanced Linear Algebra, Graduate Texts in Mathematics (2nd ed.), Springer, ISBN 978-0387247663
• Shilov, Georgi E. (June 1, 1977), Linear Algebra, Dover Publications, ISBN 978-0486635187
• Shores, Thomas S. (December 6, 2006), Applied Linear Algebra and Matrix Analysis, Undergraduate Texts in Mathematics, Springer, ISBN 978-0387331942
• Smith, Larry (May 28, 1998), Linear Algebra, Undergraduate Texts in Mathematics, Springer, ISBN 978-0387984551


Study guides and outlines
• Leduc, Steven A. (May 1, 1996), Linear Algebra (Cliffs Quick Review), Cliffs Notes, ISBN 978-0822053316
• Lipschutz, Seymour; Lipson, Marc (December 6, 2000), Schaum's Outline of Linear Algebra (3rd ed.), McGraw-Hill, ISBN 978-0071362009
• Lipschutz, Seymour (January 1, 1989), 3,000 Solved Problems in Linear Algebra, McGraw–Hill, ISBN 978-0070380233
• McMahon, David (October 28, 2005), Linear Algebra Demystified, McGraw–Hill Professional, ISBN 978-0071465793
• Zhang, Fuzhen (April 7, 2009), Linear Algebra: Challenging Problems for Students, The Johns Hopkins University Press, ISBN 978-0801891250

External links
• International Linear Algebra Society (http://www.math.technion.ac.il/iic/)
• MIT Professor Gilbert Strang's Linear Algebra Course Homepage (http://web.mit.edu/18.06/www): MIT Course Website
• MIT Linear Algebra Lectures (http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/index.htm): free videos from MIT OpenCourseWare
• Linear Algebra Toolkit (http://www.math.odu.edu/~bogacki/lat/)
• Linear Algebra (http://mathworld.wolfram.com/topics/LinearAlgebra.html) on MathWorld
• Linear Algebra overview (http://planetmath.org/encyclopedia/LinearAlgebra.html) and notation summary (http://planetmath.org/encyclopedia/NotationInLinearAlgebra.html) on PlanetMath
• Linear Algebra tutorial (http://people.revoledu.com/kardi/tutorial/LinearAlgebra/index.html) with online interactive programs
• Matrix and Linear Algebra Terms (http://www.economics.soton.ac.uk/staff/aldrich/matrices.htm) on Earliest Known Uses of Some of the Words of Mathematics (http://jeff560.tripod.com/mathword.html)
• Earliest Uses of Symbols for Matrices and Vectors (http://jeff560.tripod.com/matrices.html) on Earliest Uses of Various Mathematical Symbols (http://jeff560.tripod.com/mathsym.html)
• Linear Algebra (http://www.egwald.ca/linearalgebra/index.php) by Elmer G. Wiens. Interactive web pages for vectors, matrices, linear equations, etc.
• Linear Algebra Solved Problems (http://www.mathlinks.ro/Forum/index.php?f=346): interactive forums for discussion of linear algebra problems, from the lowest up to the hardest level (Putnam)
• Linear Algebra for Informatics (http://xmlearning.maths.ed.ac.uk). José Figueroa-O'Farrill, University of Edinburgh
• Online Notes / Linear Algebra (http://tutorial.math.lamar.edu/classes/linalg/linalg.aspx). Paul Dawkins, Lamar University
• Elementary Linear Algebra textbook with solutions (http://www.numbertheory.org/book/)
• Linear Algebra Wiki (http://www.linearalgebrawiki.org/)
• Linear algebra (math 21b) homework and exercises (http://www.courses.fas.harvard.edu/~math21b/)
• Textbook and solutions manual (http://www.saylor.org/courses/ma211/), Saylor Foundation


Online books
• Beezer, Rob, A First Course in Linear Algebra (http://linear.ups.edu/index.html)
• Connell, Edwin H., Elements of Abstract and Linear Algebra (http://www.math.miami.edu/~ec/book/)
• Hefferon, Jim, Linear Algebra (http://joshua.smcvt.edu/linalg.html/)
• Matthews, Keith, Elementary Linear Algebra (http://www.numbertheory.org/book/)
• Sharipov, Ruslan, Course of linear algebra and multidimensional geometry (http://arxiv.org/abs/math.HO/0405323)
• Treil, Sergei, Linear Algebra Done Wrong (http://www.math.brown.edu/~treil/papers/LADW/LADW.html)
