Linear transformation


Definition and Basic Properties

In the field of mathematics, specifically in linear algebra, a linear transformation (also known as linear map or linear function) is a mapping V → W between two vector spaces that preserves the operations of addition and scalar multiplication. This means that for any vectors u, v in V and any scalar c, the following two conditions must hold:

1. T(u + v) = T(u) + T(v)
2. T(cu) = cT(u)
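
To make the two conditions concrete, the following Python sketch checks them numerically for the map T(x) = Ax defined by a fixed matrix A; the particular matrix, vectors, and scalar are arbitrary choices for illustration.

```python
import numpy as np

# A concrete linear map T: R^3 -> R^2 given by T(x) = A @ x.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])

def T(x):
    return A @ x

u = np.array([1.0, 0.5, -2.0])
v = np.array([4.0, -1.0, 0.0])
c = 2.5

# Condition 1: additivity, T(u + v) = T(u) + T(v)
print(np.allclose(T(u + v), T(u) + T(v)))   # True

# Condition 2: homogeneity, T(cu) = cT(u)
print(np.allclose(T(c * u), c * T(u)))      # True
```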

The concept of linear transformations is central to modern mathematics and has extensive applications in areas such as physics, engineering, and computer science.

Matrix Representation

Every linear transformation between finite-dimensional vector spaces can be represented by a matrix. This is a powerful tool because it allows us to use the techniques of matrix algebra to study linear transformations. The matrix of a linear transformation T: V → W with respect to chosen bases of V and W is a rectangular array whose columns are the coordinate vectors, relative to the basis of W, of the images under T of the basis vectors of V.

A matrix with rows and columns filled with numbers, representing a linear transformation.
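
As a sketch of how such a matrix arises, the snippet below builds the matrix of an example map T: R^3 → R^2 with respect to the standard bases, column by column, from the images of the basis vectors; the specific map is an arbitrary choice.

```python
import numpy as np

# An example linear map T: R^3 -> R^2 (an arbitrary choice for illustration).
def T(x):
    x1, x2, x3 = x
    return np.array([x1 + 2 * x2, 3 * x3 - x2])

# With respect to the standard bases, the j-th column of the matrix of T
# is T(e_j), the image of the j-th standard basis vector.
n = 3
columns = [T(np.eye(n)[:, j]) for j in range(n)]
A = np.column_stack(columns)
print(A)
# [[ 1.  2.  0.]
#  [ 0. -1.  3.]]

# Applying the matrix agrees with applying T directly.
x = np.array([1.0, -2.0, 0.5])
print(np.allclose(A @ x, T(x)))   # True
```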

Kernel and Range

Two important subspaces associated with a linear transformation T: V → W are the kernel (or null space) and the range (or image). The kernel of T is the set of all vectors in V that T sends to the zero vector in W; it is a subspace of V. The range of T is the set of all vectors in W of the form T(v) for some v in V; it is a subspace of W.
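
The sketch below computes bases for the kernel and range of a small example matrix using the singular value decomposition; the matrix is an arbitrary choice with a one-dimensional kernel, and the rank-nullity relation dim ker(T) + dim range(T) = dim V can be read off from the output.

```python
import numpy as np

# Matrix of a linear map T: R^3 -> R^3 with a nontrivial kernel
# (an arbitrary example: the third column is the sum of the first two).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

# Singular value decomposition: right-singular vectors whose singular
# values are (numerically) zero span the kernel; left-singular vectors
# with nonzero singular values span the range.
U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]
rank = int(np.sum(s > tol))

kernel_basis = Vt[rank:].T          # columns form a basis of ker(T)
range_basis = U[:, :rank]           # columns form a basis of range(T)

print(rank)                               # 2
print(kernel_basis.shape[1])              # 1, so rank + nullity = 3 = dim(V)
print(np.allclose(A @ kernel_basis, 0))   # kernel vectors map to the zero vector
```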

Invertibility and Isomorphisms

A linear transformation T: V → W is said to be invertible if there exists a linear transformation S: W → V such that ST = Id_V and TS = Id_W, where Id_V and Id_W are the identity transformations on V and W respectively. If such an S exists, it is unique and is called the inverse of T. If a linear transformation has an inverse, it is an isomorphism. Isomorphisms are important in the study of vector spaces because they preserve the structure of the spaces and allow us to consider them as essentially the same.
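
In coordinates, invertibility of T corresponds to invertibility of its matrix. A minimal sketch, using an arbitrarily chosen 2×2 matrix:

```python
import numpy as np

# An invertible linear map T: R^2 -> R^2 given by T(x) = A @ x
# (a 2x2 matrix with nonzero determinant, chosen arbitrarily).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

print(np.linalg.det(A))        # approximately 1.0 (nonzero), so T is invertible

S = np.linalg.inv(A)           # matrix of the inverse transformation S

# ST = Id_V and TS = Id_W, as in the definition above.
print(np.allclose(S @ A, np.eye(2)))   # True
print(np.allclose(A @ S, np.eye(2)))   # True
```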

Eigenvalues and Eigenvectors

If T: V → V is a linear transformation from a vector space to itself, then a non-zero vector v in V is an eigenvector for T if T(v) = λv for some scalar λ. The scalar λ is called an eigenvalue of T. The concepts of eigenvalues and eigenvectors have many applications in different areas of mathematics and its applications, including differential equations, physics, and computer science.
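
In matrix form, eigenvalues and eigenvectors can be computed numerically; the following sketch uses an arbitrarily chosen symmetric 2×2 matrix and verifies the defining relation T(v) = λv for each computed pair.

```python
import numpy as np

# A linear map T: R^2 -> R^2 represented by a symmetric matrix
# (an arbitrary example with real eigenvalues).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))    # [1. 3.]

# Each column of `eigenvectors` is an eigenvector v with A @ v = lambda * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))   # True, True
```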

Linear Transformations and Coordinate Changes

Linear transformations are closely related to the concept of changing coordinates. If we have a linear transformation T: V → W with matrix A relative to given bases of V and W, and we change to new bases via invertible change-of-basis matrices P (in V) and Q (in W), then the matrix of T with respect to the new bases is Q⁻¹AP. In the important special case T: V → V with a single change of basis P, the new matrix is P⁻¹AP, so the two matrices are similar.
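
A small numerical sketch of this relationship, with an arbitrarily chosen matrix A and change-of-basis matrix P for a transformation T: V → V:

```python
import numpy as np

# Matrix of T: R^2 -> R^2 with respect to the standard basis (arbitrary example).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Change of basis in R^2: the columns of P are the new basis vectors
# expressed in the old (standard) coordinates.
P = np.array([[1.0, 1.0],
              [1.0, -2.0]])

# Matrix of the same transformation with respect to the new basis.
B = np.linalg.inv(P) @ A @ P

# Sanity check: applying T gives the same result whether we work in
# old coordinates with A or in new coordinates with B.
x_new = np.array([1.0, 2.0])        # coordinates of a vector in the new basis
x_old = P @ x_new                   # the same vector in old coordinates
print(np.allclose(P @ (B @ x_new), A @ x_old))   # True
```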
