Linear Algebra
Linear algebra is a branch of mathematics that deals with vectors, vector spaces (also called linear spaces), linear transformations, and systems of linear equations. It forms the foundation of many scientific disciplines and is used in fields such as physics, computer science, engineering, and economics.
Introduction to vectors
A vector is an object that has both magnitude and direction. You can think of it as an arrow pointing from one point in space to another. Vectors are usually represented by symbols such as a, b, or v.
Vectors in 2D
In two dimensions, a vector can be represented as an ordered pair of numbers, which are its coordinates. For example, the vector v = (3, 4) has component 3 in the x-direction and component 4 in the y-direction.
v = (3, 4)
Vectors in 3D
In three dimensions, a vector is represented as a triplet of numbers: v = (x, y, z). The third component adds depth or height.
v = (3, 4, 5)
Vector operations
Several basic operations can be performed on vectors:
Vector addition
To add two vectors, you just add their corresponding components. If a = (a1, a2) and b = (b1, b2), then a + b = (a1 + b1, a2 + b2).
a = (3, 4)
b = (1, 2)
a + b = (3 + 1, 4 + 2) = (4, 6)
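Component-wise addition is straightforward to express in code. Here is a minimal sketch in plain Python (the function name `vector_add` is our own choice for illustration):

```python
def vector_add(a, b):
    """Add two vectors of equal dimension, component by component."""
    return tuple(x + y for x, y in zip(a, b))

a = (3, 4)
b = (1, 2)
print(vector_add(a, b))  # (4, 6)
```

The same function works unchanged for 3D vectors, since `zip` pairs up however many components the tuples have.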
Scalar multiplication
Scalar multiplication involves scaling a vector by a number called a scalar. For a vector a = (a1, a2) and a scalar c, the product is c * a = (c*a1, c*a2).
c = 2
a = (3, 4)
c * a = (2 * 3, 2 * 4) = (6, 8)
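Scalar multiplication follows the same pattern: scale every component by the scalar. A small sketch, with a function name of our own choosing:

```python
def scalar_multiply(c, a):
    """Scale every component of vector a by the scalar c."""
    return tuple(c * x for x in a)

print(scalar_multiply(2, (3, 4)))  # (6, 8)
```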
Vector space
A vector space is a collection of vectors that are closed under vector addition and scalar multiplication. This means that if you take two vectors in a vector space and add them, the result is also in the vector space. Similarly, if you take a vector in the space and multiply it by a scalar, the result is still in the vector space.
Basis and dimension
A set of vectors forms a basis for a vector space if every vector in the space can be written in exactly one way as a linear combination of the basis vectors; equivalently, the set spans the space and is linearly independent. The number of vectors in a basis is called the dimension of the vector space.
For example, in two-dimensional space, the standard basis is {(1, 0), (0, 1)}, so the space has dimension 2.
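The idea that every vector is a linear combination of basis vectors can be demonstrated numerically. This sketch (the helper name `linear_combination` is our own) rebuilds (3, 4) from the standard basis:

```python
def linear_combination(coeffs, basis):
    """Sum coeff * basis_vector over the basis, component-wise."""
    dim = len(basis[0])
    result = [0] * dim
    for c, vec in zip(coeffs, basis):
        for i in range(dim):
            result[i] += c * vec[i]
    return tuple(result)

e1, e2 = (1, 0), (0, 1)          # standard basis of 2D space
v = linear_combination((3, 4), (e1, e2))
print(v)  # (3, 4): v = 3*e1 + 4*e2
```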
Matrices
A matrix is a rectangular array of numbers arranged in rows and columns. Matrices are used to represent linear transformations and systems of linear equations.
Matrix representation
A matrix is usually denoted by a capital letter such as A or B; a 2x2 matrix looks like this:
A = | 1 2 |
    | 3 4 |
Matrix operations
Like vectors, matrices support several operations, such as addition, multiplication, and finding the inverse.
Matrix addition
Matrices can be added together if they have the same dimensions. To add two matrices, add their corresponding elements.
A = | 1 2 |    B = | 5 6 |
    | 3 4 |        | 7 8 |

A + B = | 1+5 2+6 | = |  6  8 |
        | 3+7 4+8 |   | 10 12 |
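Representing a matrix as a list of rows, element-wise addition can be sketched in a few lines (the function name `matrix_add` is our own):

```python
def matrix_add(A, B):
    """Add two matrices of the same shape, element by element."""
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_add(A, B))  # [[6, 8], [10, 12]]
```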
Matrix multiplication
Matrix multiplication is more involved than addition: each entry of the product is the dot product of a row of the first matrix with a column of the second. If A is an m x n matrix and B is an n x p matrix, then the product C = AB is an m x p matrix.
A = | 1 2 |    B = | 5 6 |
    | 3 4 |        | 7 8 |

AB = | 1*5 + 2*7  1*6 + 2*8 | = | 19 22 |
     | 3*5 + 4*7  3*6 + 4*8 |   | 43 50 |
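The row-times-column rule translates directly into nested loops. A minimal sketch (the function name is our own):

```python
def matrix_multiply(A, B):
    """Multiply an m x n matrix A by an n x p matrix B."""
    n = len(B)        # inner dimension: columns of A = rows of B
    p = len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(p)]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_multiply(A, B))  # [[19, 22], [43, 50]]
```

Note that unlike addition, matrix multiplication is generally not commutative: AB and BA can differ.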
Linear transforms
A linear transformation is a function between two vector spaces that preserves vector addition and scalar multiplication. If T is a linear transformation, then for all vectors u and v and any scalar c, the following conditions hold: T(u + v) = T(u) + T(v) and T(c * u) = c * T(u).
Determinants and inverses
Determinants
The determinant is a special number that can be calculated from a square matrix. It provides important information about the matrix, such as whether or not it is invertible and how much the associated linear transformation scales areas (or volumes).
A = | a b |
    | c d |

det(A) = ad - bc
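For a 2x2 matrix the formula is a one-liner (the function name `det2` is our own):

```python
def det2(A):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

print(det2([[1, 2], [3, 4]]))  # 1*4 - 2*3 = -2
```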
Inverse
The inverse of a matrix A is another matrix, denoted A^-1, such that their product is the identity matrix. Not all matrices have inverses: a matrix must be square and have a non-zero determinant to be invertible.
A = | 1 2 |
    | 3 4 |

A^-1 = (1/det(A)) * |  d -b |
                    | -c  a |

A^-1 = (1/(1*4 - 2*3)) * |  4 -2 |
                         | -3  1 |

A^-1 = |  -2    1  |
       | 1.5 -0.5 |
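The 2x2 inverse formula can be coded directly, guarding against the non-invertible case. A sketch with a function name of our own:

```python
def inverse2(A):
    """Inverse of a 2x2 matrix, if its determinant is non-zero."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    if det == 0:
        raise ValueError("matrix is not invertible (determinant is 0)")
    # Swap the diagonal, negate the off-diagonal, divide by det.
    return [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]

print(inverse2([[1, 2], [3, 4]]))  # [[-2.0, 1.0], [1.5, -0.5]]
```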
Eigenvalues and eigenvectors
Eigenvalues and eigenvectors are important concepts in linear algebra and have applications in many areas of science and engineering. An eigenvector of a square matrix A is a non-zero vector v such that when A is multiplied by v, the result is a scalar multiple of v. This scalar is called the eigenvalue.
Av = λv

where:
  A = matrix
  v = eigenvector
  λ = eigenvalue
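For a 2x2 matrix, the eigenvalues are the roots of the characteristic polynomial λ² - trace(A)·λ + det(A) = 0. This sketch (assuming real eigenvalues; the function name and sample matrix are our own choices) computes them and verifies Av = λv for one eigenvector:

```python
import math

def eigenvalues2(A):
    """Real eigenvalues of a 2x2 matrix via the quadratic formula
    applied to the characteristic polynomial."""
    trace = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = math.sqrt(trace * trace - 4 * det)  # assumes real roots
    return (trace + disc) / 2, (trace - disc) / 2

A = [[2, 1], [1, 2]]         # sample symmetric matrix
print(eigenvalues2(A))       # (3.0, 1.0)

# v = (1, 1) is an eigenvector for eigenvalue 3: check Av = 3v.
v = (1, 1)
Av = (A[0][0] * v[0] + A[0][1] * v[1],
      A[1][0] * v[0] + A[1][1] * v[1])
print(Av == (3 * v[0], 3 * v[1]))  # True
```

For larger matrices, numerical libraries such as NumPy (`numpy.linalg.eig`) are the practical tool.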
Applications of linear algebra
Linear algebra has many applications in various fields:
- Computer Graphics: Linear algebra is widely used in computer graphics to represent and manipulate images and 3D models. Transformations such as scaling, rotation, and translation are easily performed using matrices.
- Engineering: Engineers use linear algebra to solve systems of linear equations that describe physical phenomena such as electrical circuits and mechanical structures.
- Physics: Linear algebra is used in physics to mathematically model the physical world. Linear algebra is widely used in topics such as quantum mechanics and relativity.
- Economics: Economists use linear algebra to analyze economic models and optimize production and distribution processes.
Conclusion
Linear algebra is a fundamental area of mathematics that underlies many modern scientific and engineering disciplines. Understanding basic concepts such as vectors, matrices, linear transformations, and the operations that can be performed on them provides a powerful toolkit for solving complex real-world problems.