Orthogonal matrix


Definition and Properties

An orthogonal matrix is a real square matrix whose columns and rows are orthonormal vectors, meaning that the matrix multiplied by its transpose yields the identity matrix. Formally, a matrix \( Q \) is orthogonal if \( Q^T Q = Q Q^T = I \), where \( Q^T \) is the transpose of \( Q \) and \( I \) is the identity matrix; equivalently, \( Q^{-1} = Q^T \).
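
As a quick numerical check of the definition (a minimal sketch using NumPy; the helper name is_orthogonal and the tolerance are choices made here, not part of any standard API):

    import numpy as np

    def is_orthogonal(Q, tol=1e-10):
        """Return True if Q is (numerically) orthogonal, i.e. Q^T Q = I."""
        Q = np.asarray(Q, dtype=float)
        n, m = Q.shape
        if n != m:
            return False  # orthogonal matrices are square by definition
        return np.allclose(Q.T @ Q, np.eye(n), atol=tol)

    # A rotation by 30 degrees in the plane is orthogonal; a shear is not.
    theta = np.pi / 6
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    print(is_orthogonal(R))                  # True
    print(is_orthogonal([[1, 1], [0, 1]]))   # False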

Orthogonal matrices have several important properties:

  • The rows and columns of an orthogonal matrix are orthonormal vectors.
  • The determinant of an orthogonal matrix is either +1 or -1.
  • Orthogonal matrices preserve the dot product: \( (Qu) \cdot (Qv) = u \cdot v \) for any vectors \( u \) and \( v \) (see the derivation after this list).
  • Orthogonal matrices preserve the Euclidean norm, meaning that \( \|Qx\| = \|x\| \) for any vector \( x \).
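
Each of these properties follows directly from \( Q^T Q = I \); for instance:

\[
(Qu) \cdot (Qv) = (Qu)^T (Qv) = u^T Q^T Q\, v = u^T v = u \cdot v,
\qquad
\|Qx\|^2 = x^T Q^T Q\, x = x^T x = \|x\|^2,
\]
\[
\det(Q)^2 = \det(Q^T)\det(Q) = \det(Q^T Q) = \det(I) = 1 .
\]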

Examples and Applications

Orthogonal matrices appear in various areas of mathematics and applied sciences. Some common examples include:

  • The identity matrix \( I \) is trivially orthogonal.
  • Rotation matrices in two and three dimensions are orthogonal.
  • Reflection matrices are orthogonal (explicit two-dimensional forms of both are given below).
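
For concreteness, the standard rotation by an angle \( \theta \) and the reflection across the horizontal axis in two dimensions are

\[
R(\theta) =
\begin{pmatrix}
\cos\theta & -\sin\theta \\
\sin\theta & \cos\theta
\end{pmatrix},
\qquad
F =
\begin{pmatrix}
1 & 0 \\
0 & -1
\end{pmatrix},
\]

with \( \det R(\theta) = +1 \) and \( \det F = -1 \), matching the determinant property above.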

Orthogonal matrices are used in numerous applications, including numerical linear algebra (QR factorization, eigenvalue and least squares algorithms), computer graphics (rotating and reflecting geometry), signal processing (orthogonal transforms), and multivariate statistics (principal component analysis via the SVD).

Construction and Decomposition

Orthogonal matrices can be constructed in several ways:

  • By applying the Gram-Schmidt process to a set of linearly independent vectors (a sketch follows this list).
  • As products of Householder reflections or Givens rotations.
  • As the matrix exponential of a skew-symmetric matrix, which yields an orthogonal matrix with determinant +1.

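A minimal Gram-Schmidt sketch (assuming NumPy; the function gram_schmidt below is illustrative, not a library routine):

    import numpy as np

    def gram_schmidt(A):
        """Orthonormalize the columns of A (assumed linearly independent)
        using the modified Gram-Schmidt process."""
        A = np.array(A, dtype=float)
        n, k = A.shape
        Q = np.zeros((n, k))
        for j in range(k):
            v = A[:, j].copy()
            for i in range(j):
                v -= (Q[:, i] @ v) * Q[:, i]   # remove the component along q_i
            Q[:, j] = v / np.linalg.norm(v)    # normalize
        return Q

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))            # a generic square matrix
    Q = gram_schmidt(A)
    print(np.allclose(Q.T @ Q, np.eye(4)))     # True: Q is orthogonal
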
Orthogonal matrices also appear as factors when a general matrix is decomposed into simpler components:

  • The QR decomposition expresses a matrix as the product of an orthogonal matrix \( Q \) and an upper triangular matrix \( R \).
  • The Singular Value Decomposition (SVD) expresses a real matrix as the product of two orthogonal matrices and a diagonal matrix (both decompositions are demonstrated below).
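
A short demonstration of both factorizations with NumPy's built-in routines (the random test matrix is an arbitrary placeholder):

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.standard_normal((5, 5))

    # QR decomposition: A = Q R with Q orthogonal and R upper triangular.
    Q, R = np.linalg.qr(A)
    print(np.allclose(Q.T @ Q, np.eye(5)))      # True
    print(np.allclose(Q @ R, A))                # True

    # SVD: A = U diag(s) V^T with U and V orthogonal.
    U, s, Vt = np.linalg.svd(A)
    print(np.allclose(U.T @ U, np.eye(5)))      # True
    print(np.allclose(Vt @ Vt.T, np.eye(5)))    # True
    print(np.allclose(U @ np.diag(s) @ Vt, A))  # True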

Properties in Higher Dimensions

In higher dimensions, orthogonal matrices retain their fundamental properties but exhibit more complex behaviors. For instance:

  • In \( n \) dimensions, an orthogonal matrix represents a combination of rotations and reflections.
  • The set of all \( n \times n \) orthogonal matrices forms a group under matrix multiplication, known as the orthogonal group \( O(n) \); the subgroup of matrices with determinant +1 is the special orthogonal group \( SO(n) \) of rotations (see the numerical check below).
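
A small numerical check of the group structure (a sketch using NumPy; random_orthogonal is an illustrative helper built on the QR factorization of a Gaussian random matrix):

    import numpy as np

    def random_orthogonal(n, rng):
        """Return a random n x n orthogonal matrix via the QR factorization
        of a Gaussian random matrix (one common construction)."""
        Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
        return Q

    rng = np.random.default_rng(2)
    Q1 = random_orthogonal(4, rng)
    Q2 = random_orthogonal(4, rng)

    # Closure: the product of two orthogonal matrices is orthogonal.
    P = Q1 @ Q2
    print(np.allclose(P.T @ P, np.eye(4)))        # True

    # The inverse of an orthogonal matrix is its transpose.
    print(np.allclose(np.linalg.inv(Q1), Q1.T))   # True

    # Each determinant is +1 (a rotation, in SO(n)) or -1 (includes a reflection).
    print(np.linalg.det(Q1), np.linalg.det(Q2))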

Numerical Stability and Computation

Orthogonal matrices are particularly important in numerical computations because multiplying by them does not amplify rounding errors: an orthogonal matrix has condition number 1 in the 2-norm, since it preserves Euclidean lengths. In particular:

  • They are used in algorithms that require numerical stability, such as the QR algorithm for eigenvalue computation.
  • Orthogonal transformations are used to solve least squares problems, for example via the QR factorization, which avoids forming the poorly conditioned normal equations (a sketch follows this list).
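
A least squares sketch using the QR route (assuming NumPy; the problem data are random placeholders):

    import numpy as np

    rng = np.random.default_rng(3)
    m, n = 100, 3
    A = rng.standard_normal((m, n))
    b = rng.standard_normal(m)

    # Reduced QR: A = Q R, where Q (m x n) has orthonormal columns and R is n x n.
    # The least squares solution of min ||Ax - b|| satisfies R x = Q^T b.
    Q, R = np.linalg.qr(A)
    x_qr = np.linalg.solve(R, Q.T @ b)

    # Compare with NumPy's built-in least squares routine.
    x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(np.allclose(x_qr, x_ref))               # True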

[Figure: Illustration of a 3D rotation matrix applied to a vector.]