Day03 Basic Mathematics Review (3)
Eigendecomposition; Symmetric Matrices and Eigendecomposition
(Korean Explanation from https://darkpgmr.tistory.com/105)
Eigendecomposition
Eigenvalue decomposition is a mathematical process used in linear algebra to decompose a square matrix into its constituent parts. This process reveals many of the matrix's properties, such as whether it can be inverted, its determinant, and its rank.
Eigendecomposition plays a crucial role in engineering, physics, and data science. It's a key tool in PCA (Principal Component Analysis), linear transformations, and systems of differential equations.
If a square matrix $A$ of size $n \times n$ has $n$ linearly independent eigenvectors, then $A$ can be decomposed into $A = PDP^{-1}$, where $P$ is the matrix formed by placing the eigenvectors as its columns, $D$ is a diagonal matrix whose diagonal elements are the corresponding eigenvalues, and $P^{-1}$ is the inverse of the matrix $P$.
Reference (translated from Korean)
That is, if $P$ is the matrix whose columns are the eigenvectors of $A$, and $D$ is the diagonal matrix whose diagonal elements are the corresponding eigenvalues, then the following holds:
\(AP = PD \\
A = PDP^{-1}\)
In this way, the matrix $A$ can be diagonalized as the product of a matrix whose columns are its eigenvectors and a diagonal matrix whose diagonal elements are its eigenvalues; this diagonalizing factorization is called the eigendecomposition. Once the eigendecomposition of $A$ is known, the determinant $\det(A)$, powers of $A$, the inverse matrix, the trace, matrix polynomials, and so on can all be computed very easily.
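The decomposition $A = PDP^{-1}$ and the shortcuts it enables (determinant, powers) can be sketched with NumPy; the matrix values below are illustrative, not from the text:

```python
import numpy as np

# A small diagonalizable square matrix (illustrative values)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (normalized) eigenvectors
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Reconstruct A = P D P^{-1}
A_reconstructed = P @ D @ np.linalg.inv(P)
assert np.allclose(A, A_reconstructed)

# Once the decomposition is known, other quantities come cheaply:
# det(A) is the product of the eigenvalues, and A^k = P D^k P^{-1}
assert np.isclose(np.linalg.det(A), np.prod(eigvals))
A_cubed = P @ np.diag(eigvals**3) @ np.linalg.inv(P)
assert np.allclose(A_cubed, np.linalg.matrix_power(A, 3))
```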
Linear (in)dependence
- Linear independence: No vector is a linear combination of the other vectors
- Common special cases: a set that contains the zero (null) vector is always linearly dependent, while a set of mutually perpendicular nonzero vectors is always linearly independent
(Vectors are said to be linearly independent if no vector among them can be expressed as a linear combination (sum of scalar multiples) of the others. Caution: for a given matrix the eigenvalues are uniquely determined, but the eigenvectors are not. Any vector satisfying the relevant constraints may therefore be used as an eigenvector, but it is conventional to take the unit vector whose length has been normalized to 1.)
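Both points above can be checked numerically: linear independence of a set of vectors via the rank of the matrix holding them as columns, and the convention that eigenvectors are returned normalized to unit length (the vectors here are illustrative):

```python
import numpy as np

# Columns of each matrix are the vectors being tested (illustrative values)
independent = np.array([[1.0, 0.0],
                        [0.0, 1.0]])
dependent = np.array([[1.0, 2.0],
                      [2.0, 4.0]])  # second column = 2 * first column

def linearly_independent(V):
    # The columns are linearly independent iff the rank
    # equals the number of columns
    return np.linalg.matrix_rank(V) == V.shape[1]

assert linearly_independent(independent)
assert not linearly_independent(dependent)

# Eigenvectors are not unique: any nonzero scalar multiple works, so
# np.linalg.eig returns them normalized to unit length by convention
_, P = np.linalg.eig(np.array([[2.0, 0.0], [0.0, 3.0]]))
assert np.allclose(np.linalg.norm(P, axis=0), 1.0)
```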
Symmetric Matrix and Eigendecomposition
A symmetric matrix is a square matrix equal to its transpose, $A = A^{T}$. The eigenvalue decomposition of symmetric matrices has some special properties that make it useful in practical applications. \(A = PDP^{-1} \\ = PDP^{T}\\ for \ PP^{T} = E \ (P^{-1}=P^{T})\)
Properties of Symmetric Matrices in Eigendecomposition
- Real Eigenvalues: The eigenvalues of a real symmetric matrix are always real numbers, even though the eigenvalues of a general square matrix may be complex. This property is particularly useful because it simplifies many problems in physics and engineering where real solutions are required.
- Orthogonal Eigenvectors: For any pair of eigenvectors $v_i$ and $v_j$ of a symmetric matrix corresponding to different eigenvalues, the eigenvectors are orthogonal. That is, the dot product $v_i \cdot v_j = 0$ for $i \neq j$. When eigenvectors are orthogonal, they lie at 90 degrees to each other in the vector space.
  - Dimensionality: Each eigenvector defines a unique direction in the space. The fact that these directions are orthogonal ensures that they are independent of one another. This is a highly desirable property in many applications, like PCA, where we want to capture independent directions of variance.
  - Simplification: Orthogonal directions simplify computation and reduce numerical errors in calculations, especially transformations and projections in high-dimensional spaces.
- Orthogonal Matrix: For a symmetric matrix $A$, if $P$ is the matrix whose columns are the (normalized) eigenvectors of $A$, then $P$ is an orthogonal matrix. An orthogonal matrix has a special property: $P^T = P^{-1}$. This equality implies that transposing the matrix $P$ is equivalent to inverting it.
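The three properties above can be verified together with NumPy's `eigh`, which is specialized for symmetric matrices (the matrix values are illustrative):

```python
import numpy as np

# A real symmetric matrix (illustrative values)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
assert np.allclose(A, A.T)

# np.linalg.eigh is the eigendecomposition routine for
# symmetric/Hermitian matrices
eigvals, P = np.linalg.eigh(A)

# Real eigenvalues: eigh returns a real-valued array
assert np.isrealobj(eigvals)

# Orthogonal eigenvectors: eigh returns them orthonormal,
# so P is an orthogonal matrix with P^T P = I, i.e. P^T = P^{-1}
assert np.allclose(P.T @ P, np.eye(2))

# Hence A = P D P^T, with no explicit matrix inverse needed
assert np.allclose(A, P @ np.diag(eigvals) @ P.T)
```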
Reference (translated from Korean)
First, speaking of vectors: two vectors $v_1$, $v_2$ are said to be orthogonal to each other if they are perpendicular (i.e., $v_1 \cdot v_2 = 0$). Turning a vector into a unit vector of length 1, as in $v^{\prime} = \frac{v}{\Vert v \Vert}$, is called normalization. The word orthonormal combines orthogonal and normal: if $v_1$ and $v_2$ are both unit vectors and orthogonal to each other, then $v_1$, $v_2$ are said to be orthonormal.
orthogonal: $v_1 \cdot v_2 = 0$
orthonormal: $v_1 \cdot v_2 = 0$ & $\lVert v_1 \rVert = 1$, $\lVert v_2 \rVert = 1$
That is, orthogonal and orthonormal describe relationships between vectors; when we move on to matrices, the meaning shifts slightly.
The mathematical definition of an orthogonal matrix is a square matrix whose transpose is its inverse.
\[A^{-1} = A^T\]As shown, transposing an orthogonal matrix (swapping its row and column elements) yields its own inverse, so it is a matrix with very convenient properties in all kinds of linear algebra computations.
Moreover, the column vectors of an orthogonal matrix are orthonormal to one another. That is, if $v_1, v_2, \dots, v_n$ are the column vectors of an orthogonal matrix, they are all unit vectors and also mutually perpendicular. The same property holds for the row vectors, not just the column vectors (i.e., the row vectors are also orthonormal to each other).
In other words, an orthogonal matrix can also be defined as a matrix whose column (row) vectors are mutually perpendicular (orthogonal) and of length 1 (normal).
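Both definitions of an orthogonal matrix can be checked on a concrete example; a 2D rotation matrix is a classic orthogonal matrix (the angle here is an arbitrary illustrative choice):

```python
import numpy as np

# A 2D rotation matrix is orthogonal (illustrative angle)
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Definition 1: the transpose is the inverse, Q^{-1} = Q^T
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(np.linalg.inv(Q), Q.T)

# Definition 2: the columns (and rows) are orthonormal
for v in Q.T:  # rows of Q.T are the columns of Q
    assert np.isclose(np.linalg.norm(v), 1.0)
assert np.isclose(Q[:, 0] @ Q[:, 1], 0.0)  # columns perpendicular
assert np.isclose(Q[0] @ Q[1], 0.0)        # rows perpendicular too
```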