Orthogonal matrix
Definition and Properties
An orthogonal matrix is a real square matrix whose rows and columns are orthonormal vectors, so that the matrix multiplied by its transpose yields the identity matrix. Formally, a matrix \( Q \) is orthogonal if \( Q^T Q = Q Q^T = I \), where \( Q^T \) is the transpose of \( Q \) and \( I \) is the identity matrix; equivalently, \( Q^{-1} = Q^T \).
Orthogonal matrices have several important properties:
- The rows and columns of an orthogonal matrix are orthonormal vectors.
- The determinant of an orthogonal matrix is either +1 or -1.
- Orthogonal matrices preserve the dot product: for any vectors \( u \) and \( v \), \( (Qu) \cdot (Qv) = u \cdot v \).
- Orthogonal matrices preserve the Euclidean norm: \( \|Qx\| = \|x\| \) for any vector \( x \). (Both invariances are checked numerically in the sketch below.)
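To make these properties concrete, here is a minimal NumPy sketch; the rotation matrix, the angle, and the random test vectors are illustrative choices, not taken from the article:

```python
import numpy as np

# A sample 3-D rotation about the z-axis; any angle works (illustrative choice).
theta = 0.7
Q = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Defining property: Q^T Q = Q Q^T = I (up to floating-point error).
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.allclose(Q @ Q.T, np.eye(3))

# Determinant is +1 for a pure rotation (it would be -1 with a reflection).
assert np.isclose(np.linalg.det(Q), 1.0)

# Dot-product and norm preservation for arbitrary vectors.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose((Q @ u) @ (Q @ v), u @ v)
assert np.isclose(np.linalg.norm(Q @ u), np.linalg.norm(u))
```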
Examples and Applications
Orthogonal matrices appear in various areas of mathematics and applied sciences. Some common examples include:
- The identity matrix \( I \) is trivially orthogonal.
- Rotation matrices in two and three dimensions are orthogonal, with determinant +1.
- Reflection matrices are orthogonal, with determinant -1; explicit \( 2 \times 2 \) forms of both are given below.
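In two dimensions, these two families have the standard explicit forms

\[
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
\qquad
F(\theta) = \begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix},
\]

where \( R(\theta) \) rotates the plane by \( \theta \) and has \( \det R(\theta) = +1 \), while \( F(\theta) \) reflects across the line making angle \( \theta/2 \) with the \( x \)-axis and has \( \det F(\theta) = -1 \).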
Orthogonal matrices are used in numerous applications such as:
- QR decomposition, which is used in solving linear systems and eigenvalue problems.
- Principal Component Analysis (PCA), used in statistics and machine learning for dimensionality reduction (a minimal sketch follows this list).
- Computer graphics, where orthogonal matrices are used for transformations like rotations and reflections.
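As an illustration of the PCA application, here is a minimal sketch that computes principal components via the SVD; the toy data, seed, and names (X, scores, k) are assumptions made for the example, not part of the article:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5))   # toy data: 100 samples, 5 features

Xc = X - X.mean(axis=0)             # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# The rows of Vt (columns of V) are orthonormal principal directions.
assert np.allclose(Vt @ Vt.T, np.eye(5))

k = 2
scores = Xc @ Vt[:k].T              # project data onto the top-k components
```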
Construction and Decomposition
Orthogonal matrices can be constructed in several ways:
- Via the Gram-Schmidt process, which orthonormalizes a set of linearly independent vectors (a sketch follows this list).
- Using Householder transformations, which are used in numerical linear algebra to zero out subdiagonal elements.
- From Givens rotations, which are used to introduce zeros in specific positions of a matrix.
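A minimal sketch of the construction via (modified) Gram-Schmidt; the helper name gram_schmidt and the random test matrix are illustrative assumptions:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)
    by modified Gram-Schmidt, so that Q.T @ Q = I."""
    Q = A.astype(float).copy()
    for j in range(Q.shape[1]):
        for i in range(j):
            Q[:, j] -= (Q[:, i] @ Q[:, j]) * Q[:, i]  # remove component along q_i
        Q[:, j] /= np.linalg.norm(Q[:, j])            # normalize
    return Q

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))     # random columns are almost surely independent
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(4))
```

The modified variant subtracts projections one at a time, which loses less orthogonality to rounding error than the classical variant.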
Orthogonal matrices can also be decomposed into simpler components:
- The QR decomposition expresses a matrix as the product of an orthogonal matrix \( Q \) and an upper triangular matrix \( R \).
- The Singular Value Decomposition (SVD) expresses a matrix as \( A = U \Sigma V^T \), the product of two orthogonal matrices \( U \) and \( V \) and a diagonal matrix \( \Sigma \) with nonnegative entries (both factorizations are demonstrated below).
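Both factorizations are available in NumPy; a short sketch verifying that the computed factors are orthogonal (the random matrix is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))

# QR: A = Q R with Q orthogonal and R upper triangular.
Q, R = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(5))
assert np.allclose(Q @ R, A)

# SVD: A = U diag(s) V^T with U and V orthogonal.
U, s, Vt = np.linalg.svd(A)
assert np.allclose(U @ U.T, np.eye(5))
assert np.allclose(Vt @ Vt.T, np.eye(5))
assert np.allclose(U @ np.diag(s) @ Vt, A)
```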
Properties in Higher Dimensions
In higher dimensions, orthogonal matrices retain their fundamental properties but exhibit more complex behaviors. For instance:
- In \( n \) dimensions, an orthogonal matrix represents a composition of rotations and reflections.
- The set of all \( n \times n \) orthogonal matrices forms a group under matrix multiplication, the orthogonal group \( O(n) \); its subgroup of matrices with determinant +1 is the special orthogonal group \( SO(n) \) of rotations. One common way to sample a random element of \( O(n) \) is sketched below.
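One common construction, not described in the article itself, draws a random element of \( O(n) \) by taking the QR factorization of a matrix with independent standard normal entries and folding the signs of \( R \)'s diagonal into \( Q \); random_orthogonal is a hypothetical helper name used only for this sketch:

```python
import numpy as np

def random_orthogonal(n, rng):
    """Sample an orthogonal matrix via QR of a Gaussian matrix; fixing the
    signs of R's diagonal makes the distribution uniform (Haar) on O(n)."""
    Q, R = np.linalg.qr(rng.standard_normal((n, n)))
    return Q * np.sign(np.diag(R))  # flip column signs where R's diagonal is negative

rng = np.random.default_rng(4)
Q = random_orthogonal(4, rng)
assert np.allclose(Q.T @ Q, np.eye(4))
print(np.linalg.det(Q))             # prints +1 or -1; both signs occur
```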
Numerical Stability and Computation
Orthogonal matrices are particularly important in numerical computations due to their stability properties:
- Because multiplication by an orthogonal matrix does not amplify rounding errors (its condition number with respect to the Euclidean norm is 1), orthogonal transformations appear in numerically stable algorithms such as the QR algorithm for eigenvalue computation.
- Orthogonal transformations are used in least squares problems to reduce them to triangular systems without worsening the conditioning (see the sketch below).
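A minimal sketch of solving \( \min_x \|Ax - b\| \) with a QR factorization; the toy overdetermined system is an assumed example:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((20, 3))    # 20 equations, 3 unknowns
b = rng.standard_normal(20)

# Reduced QR: A = Q R with orthonormal columns in Q and 3x3 upper-triangular R.
Q, R = np.linalg.qr(A)

# Multiplying by Q^T preserves Euclidean norms, so the minimizer of
# ||Ax - b|| solves the small triangular system R x = Q^T b.
x = np.linalg.solve(R, Q.T @ b)

assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```

Unlike forming the normal equations \( A^T A x = A^T b \), this route avoids squaring the condition number of \( A \).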