Orthogonal matrices
The concept of orthogonality arises frequently in linear algebra. It's really just a fancy word for perpendicularity, except that it extends beyond two dimensions and beyond just a pair of vectors.
But to get an understanding, let's start with two column vectors $x$ and $y$. If they are orthogonal, then the following holds:

$x^\top y = 0$
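As a quick numerical check, here is a minimal NumPy sketch; the particular vectors are chosen only for illustration:

```python
import numpy as np

# Two example column vectors, chosen only for illustration.
x = np.array([1.0, 2.0, 0.0])
y = np.array([-2.0, 1.0, 5.0])

# Their inner product x^T y is zero, so x and y are orthogonal.
print(x @ y)  # 0.0
```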
Orthogonal matrices are a special kind of square matrix whose columns are pairwise orthonormal. What this means is that we have a matrix $Q$ with the following property:

$Q^\top Q = QQ^\top = I$

Then, we can deduce that $Q^{-1} = Q^\top$ (that is, the transpose of Q is also the inverse of Q).
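To see this numerically, here is a short NumPy sketch; taking the Q factor from a QR decomposition of a random matrix is just one convenient way to produce an orthogonal matrix for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# One convenient way to obtain an orthogonal matrix for demonstration:
# take the Q factor from the QR decomposition of a random square matrix.
A = rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)

# Q^T Q = Q Q^T = I, up to floating-point error.
print(np.allclose(Q.T @ Q, np.eye(4)))     # True
print(np.allclose(Q @ Q.T, np.eye(4)))     # True

# Hence the transpose of Q is also its inverse.
print(np.allclose(Q.T, np.linalg.inv(Q)))  # True
```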
As with other types of matrices, orthogonal matrices have some special properties.
Firstly, they preserve inner products, so that the following applies:

$(Qx)^\top (Qy) = x^\top Q^\top Q y = x^\top y$

This brings us to the second property, which states that 2-norms are preserved under multiplication by an orthogonal matrix, which we can see as follows:

$\|Qx\|_2 = \sqrt{(Qx)^\top (Qx)} = \sqrt{x^\top x} = \|x\|_2$
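Continuing the sketch above, we can check both properties numerically, again building an example Q from a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(1)

# An example orthogonal matrix and two random vectors.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# Inner products are preserved: (Qx)^T (Qy) = x^T y.
print(np.allclose((Q @ x) @ (Q @ y), x @ y))                   # True

# 2-norms are preserved: ||Qx||_2 = ||x||_2.
print(np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))   # True
```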
When multiplying a vector by an orthogonal matrix, you can think of it as a transformation that preserves length, though the vector may be rotated about the origin by some angle.
The most well-known orthogonal matrix is a special matrix we have dealt with a few times already: the identity matrix $I$. Since each of its columns represents a unit of length in the direction of one of the axes, we generally refer to its columns as the standard basis.
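As a final sanity check, the identity matrix trivially satisfies the orthogonality condition, and its columns are the standard basis vectors:

```python
import numpy as np

# The identity matrix is orthogonal: I^T I = I.
I = np.eye(3)
print(np.allclose(I.T @ I, np.eye(3)))  # True

# Its columns are the standard basis vectors e_1, e_2, e_3.
print(I[:, 0], I[:, 1], I[:, 2])        # [1. 0. 0.] [0. 1. 0.] [0. 0. 1.]
```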