QR decomposition

From Canonica AI

Introduction

In linear algebra, the QR decomposition (also known as QR factorization) is a factorization of a matrix into the product of an orthogonal matrix and an upper triangular matrix. This technique is widely used in numerical linear algebra for solving linear systems, eigenvalue problems, and least squares fitting. The QR decomposition is particularly valuable because it combines numerical stability with computational efficiency.

Definition and Notation

Given a matrix \( A \) of size \( m \times n \) (with \( m \geq n \)), the QR decomposition expresses \( A \) as the product of two matrices:

\[ A = QR \]

where:

- \( Q \) is an \( m \times m \) orthogonal matrix (i.e., \( Q^T Q = I \)).
- \( R \) is an \( m \times n \) upper triangular matrix.

For a square matrix \( A \) of size \( n \times n \), the matrix \( Q \) is also square and orthogonal, and \( R \) is an upper triangular matrix of the same size.
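The defining properties can be checked numerically. The sketch below uses NumPy's built-in `numpy.linalg.qr`; the matrix is an arbitrary example chosen for illustration, not taken from the text.

```python
import numpy as np

# An arbitrary 4x3 example matrix (m >= n).
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])

# mode="reduced" returns Q (4x3, orthonormal columns) and R (3x3, upper
# triangular); mode="complete" would return the full 4x4 orthogonal Q.
Q, R = np.linalg.qr(A, mode="reduced")

assert np.allclose(Q @ R, A)            # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))  # columns of Q are orthonormal
assert np.allclose(R, np.triu(R))       # R is upper triangular
```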

Properties of QR Decomposition

The QR decomposition has several important properties:

- **Orthogonality**: The columns of \( Q \) are orthonormal vectors, meaning \( Q^T Q = I \).
- **Upper Triangular Matrix**: The matrix \( R \) is upper triangular, meaning all elements below the main diagonal are zero.
- **Uniqueness**: If \( A \) has full rank, the QR decomposition is unique provided the diagonal elements of \( R \) are required to be positive.

Methods of QR Decomposition

There are several methods to compute the QR decomposition, each with its own advantages and computational complexities.

Gram-Schmidt Process

The Gram-Schmidt process is a classical method for QR decomposition. It orthogonalizes the columns of \( A \) to form the matrix \( Q \) and then computes \( R \) as follows:

1. Initialize \( Q \) as an empty matrix.
2. For each column \( a_i \) of \( A \):
   - Subtract from \( a_i \) its projection onto each previously computed column of \( Q \).
   - Normalize the resulting vector to form the new column of \( Q \).
3. Construct \( R \) by taking the inner products of the columns of \( Q \) with the columns of \( A \).
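The steps above can be sketched as a short implementation of classical Gram-Schmidt (the function name is my own, and it assumes the columns of \( A \) are linearly independent):

```python
import numpy as np

def gram_schmidt_qr(A):
    """QR decomposition via classical Gram-Schmidt (illustrative sketch;
    assumes the columns of A are linearly independent)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):
        v = A[:, i].copy()
        for j in range(i):
            R[j, i] = Q[:, j] @ A[:, i]  # inner product with earlier Q column
            v -= R[j, i] * Q[:, j]       # subtract the projection
        R[i, i] = np.linalg.norm(v)
        Q[:, i] = v / R[i, i]            # normalize the orthogonalized vector
    return Q, R
```

In exact arithmetic this reproduces \( A = QR \); in floating point the modified Gram-Schmidt variant (updating `v` before each inner product) is preferred for stability.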

Householder Transformations

Householder transformations provide a more numerically stable method for QR decomposition. A Householder transformation is a reflection that can zero out the subdiagonal elements of a column vector. The process involves:

1. For each column \( k \) of \( A \):
   - Construct a Householder matrix \( H_k \) that zeros out the subdiagonal elements of the \( k \)-th column.
   - Apply \( H_k \) to \( A \) to move it closer to the desired upper triangular form.
2. The product of all Householder matrices forms the orthogonal matrix \( Q \), and the resulting upper triangular matrix is \( R \).
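The procedure above can be sketched as follows. This is an illustrative implementation (the function name and details are assumptions, not from the text) that accumulates \( Q \) as the product of the reflections:

```python
import numpy as np

def householder_qr(A):
    """QR decomposition via Householder reflections (illustrative sketch).
    Returns the full m x m orthogonal Q and m x n upper triangular R."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    R = A.copy()
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = R[k:, k]
        # Choose the reflection vector v so that H x is a multiple of e_1,
        # where H = I - 2 v v^T and v is normalized. The sign choice
        # avoids cancellation.
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue  # column is already zero below the diagonal
        v /= norm_v
        # Apply H_k to the trailing submatrix of R ...
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        # ... and accumulate Q = H_1 H_2 ... H_k on the right.
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R
```

Note that each \( H_k \) is applied to the submatrix rather than formed explicitly, which is the standard trick for keeping the cost at \( O(mn^2) \).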

Givens Rotations

Givens rotations are another method for QR decomposition, particularly useful for sparse matrices. A Givens rotation zeroes out an individual element of a matrix using a plane rotation. The process involves:

1. For each element below the diagonal in \( A \):
   - Construct a Givens rotation matrix \( G \) that zeroes out the element.
   - Apply \( G \) to \( A \) to move it closer to the desired upper triangular form.
2. The product of the transposes of the Givens rotation matrices forms the orthogonal matrix \( Q \), and the resulting upper triangular matrix is \( R \).
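A minimal sketch of this procedure (the function name is an assumption, not from the text), applying each \( 2 \times 2 \) rotation only to the two rows it affects:

```python
import numpy as np

def givens_qr(A):
    """QR decomposition via Givens rotations (illustrative sketch)."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    R = A.copy()
    Q = np.eye(m)
    for j in range(n):                    # work column by column
        for i in range(m - 1, j, -1):     # zero out entries below the diagonal
            a, b = R[i - 1, j], R[i, j]
            if b == 0.0:
                continue                  # entry is already zero
            r = np.hypot(a, b)
            c, s = a / r, b / r
            G = np.array([[c, s], [-s, c]])  # rotation in the (i-1, i) plane
            R[[i - 1, i], :] = G @ R[[i - 1, i], :]   # annihilate R[i, j]
            Q[:, [i - 1, i]] = Q[:, [i - 1, i]] @ G.T # accumulate Q
    return Q, R
```

Because each rotation touches only two rows, rotations can be applied selectively, which is what makes the method attractive for sparse matrices.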

Applications of QR Decomposition

QR decomposition has a wide range of applications in numerical linear algebra and beyond.

Solving Linear Systems

QR decomposition can be used to solve linear systems of the form \( Ax = b \). Decomposing \( A \) into \( QR \) rewrites the system as

\[ QRx = b. \]

Multiplying both sides by \( Q^T \) gives

\[ Rx = Q^T b. \]

Since \( R \) is upper triangular, this system can be solved efficiently using back substitution.
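A minimal sketch of this procedure, using NumPy's QR factorization and an explicit back-substitution loop (the function name and the \( 2 \times 2 \) example system are my own):

```python
import numpy as np

def solve_via_qr(A, b):
    """Solve Ax = b for square, nonsingular A via QR and back substitution."""
    Q, R = np.linalg.qr(A)
    y = Q.T @ b                  # transform the right-hand side
    n = R.shape[0]
    x = np.zeros(n)
    # Back substitution on the upper triangular system R x = y.
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = solve_via_qr(A, b)  # -> [2.0, 3.0], since 3*2 + 3 = 9 and 2 + 2*3 = 8
```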

Eigenvalue Problems

QR decomposition is a key step in the QR algorithm for finding the eigenvalues of a matrix. The QR algorithm iteratively applies QR decomposition and recombines the factors:

1. Decompose \( A \) into \( QR \).
2. Form the new matrix \( A' = RQ \).
3. Repeat the process until \( A' \) converges to an upper triangular matrix, whose diagonal elements are the eigenvalues of the original matrix.

Because \( A' = RQ = Q^T (QR) Q = Q^T A Q \), each iterate is similar to \( A \) and therefore has the same eigenvalues.
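The iteration can be sketched in a few lines. This is the unshifted variant on a symmetric example (the function name and test matrix are my own; practical implementations add shifts and a preliminary Hessenberg reduction for speed):

```python
import numpy as np

def qr_eigenvalues(A, iters=200):
    """Unshifted QR iteration (illustrative sketch). For symmetric A the
    iterates converge to a nearly diagonal matrix whose diagonal holds
    the eigenvalues."""
    Ak = np.asarray(A, dtype=float).copy()
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q  # similar to Ak: R Q = Q^T (Q R) Q = Q^T Ak Q
    return np.sort(np.diag(Ak))

# Symmetric example with known eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
evals = qr_eigenvalues(A)  # -> approximately [1.0, 3.0]
```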

Least Squares Fitting

In least squares fitting, QR decomposition provides a numerically stable alternative to forming and solving the normal equations \( A^T A x = A^T b \). Given an overdetermined system \( Ax = b \), the least squares solution minimizes the residual \( \|Ax - b\| \). With the decomposition \( A = QR \), the problem reduces to solving

\[ Rx = Q^T b, \]

which can be done efficiently using back substitution.
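As an illustration, the sketch below fits a line to sample points by least squares via the reduced QR factorization (the toy dataset is my own and lies exactly on the line \( y = 1 + 2t \), so the fit recovers the coefficients exactly):

```python
import numpy as np

# Assumed toy dataset lying exactly on the line y = 1 + 2t.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix of the overdetermined system A x = y, with x = (c0, c1).
A = np.column_stack([np.ones_like(t), t])

# Reduced QR: Q is 4x2 with orthonormal columns, R is 2x2 upper triangular.
Q, R = np.linalg.qr(A, mode="reduced")
coeffs = np.linalg.solve(R, Q.T @ y)  # solves R x = Q^T y
# coeffs -> approximately [1.0, 2.0], recovering intercept and slope
```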

Numerical Stability and Efficiency

The numerical stability and efficiency of QR decomposition methods vary. Householder transformations are generally more stable than the Gram-Schmidt process, especially in the presence of round-off errors. Givens rotations are particularly useful for sparse matrices, as they can be applied selectively to zero out specific elements.

Computational Complexity

The computational complexity of QR decomposition depends on the method used:

- **Gram-Schmidt Process**: \( O(mn^2) \)
- **Householder Transformations**: approximately \( 2mn^2 - \tfrac{2}{3}n^3 \) floating-point operations, i.e. \( O(mn^2) \)
- **Givens Rotations**: \( O(mn) \) for sparse matrices, since rotations are applied only where nonzero elements must be eliminated

Example

Consider the matrix \( A \): \[ A = \begin{pmatrix} 12 & -51 & 4 \\ 6 & 167 & -68 \\ -4 & 24 & -41 \end{pmatrix} \]

Using the Householder transformation method, we can decompose \( A \) into \( Q \) and \( R \): \[ Q = \begin{pmatrix} 6/7 & -69/175 & -58/175 \\ 3/7 & 158/175 & 6/175 \\ -2/7 & 6/35 & -33/35 \end{pmatrix} \] \[ R = \begin{pmatrix} 14 & 21 & -14 \\ 0 & 175 & -70 \\ 0 & 0 & 35 \end{pmatrix} \]
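The factors stated above can be verified numerically:

```python
import numpy as np

A = np.array([[12.0, -51.0,   4.0],
              [ 6.0, 167.0, -68.0],
              [-4.0,  24.0, -41.0]])
Q = np.array([[ 6/7, -69/175, -58/175],
              [ 3/7, 158/175,   6/175],
              [-2/7,   6/35,  -33/35]])
R = np.array([[14.0,  21.0, -14.0],
              [ 0.0, 175.0, -70.0],
              [ 0.0,   0.0,  35.0]])

assert np.allclose(Q @ R, A)            # the factors reproduce A
assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal
```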
