Singular Value Decomposition (SVD)
The Singular Value Decomposition (SVD) is one of the most powerful tools in linear algebra. It is widely used in machine learning for dimensionality reduction, noise reduction, recommendation systems, and data compression.
Definition
Any real matrix $A \in \mathbb{R}^{m \times n}$ can be decomposed as:

$$A = U \Sigma V^T$$

- $U \in \mathbb{R}^{m \times m}$: orthogonal matrix (left singular vectors).
- $\Sigma \in \mathbb{R}^{m \times n}$: diagonal matrix with non-negative values (singular values).
- $V \in \mathbb{R}^{n \times n}$: orthogonal matrix (right singular vectors).
::: info Explanation of SVD
- Singular values represent the importance (variance captured) of each dimension.
- $U$ gives the directions in the original space.
- $V$ gives the directions in feature space.
- Truncating $\Sigma$ gives low-rank approximations of $A$.
:::
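To make the truncation point concrete, here is a minimal NumPy sketch (the $4 \times 3$ matrix is made up for illustration) that keeps only the top $k$ singular values to form a rank-$k$ approximation:

```python
import numpy as np

# Hypothetical data matrix (4 samples, 3 features)
A = np.array([[3.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0],
              [2.0, 2.0, 2.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep only the top-k singular values
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# A_k has rank at most k; by the Eckart-Young theorem it is the
# best rank-k approximation of A in the Frobenius norm.
print("rank-2 approximation error:", np.linalg.norm(A - A_k))
```

Discarding the smallest singular values this way is exactly the mechanism behind the compression and noise-reduction applications discussed below.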
Mini Example
Let:

$$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{bmatrix}$$
Performing SVD:
- $U$ contains orthonormal basis vectors for the column space of $A$.
- $\Sigma$ has singular values $\sigma_1 = \sigma_2 = 1$.
- $V$ contains orthonormal basis vectors for the row space of $A$.
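These claims can be verified numerically; a quick check with NumPy:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=False)

print("Singular values:", S)  # both equal to 1
# The factors reconstruct A exactly
print(np.allclose(U @ np.diag(S) @ Vt, A))  # True
```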
Applications in ML
- PCA: essentially SVD applied to the centered data matrix.
- Latent Semantic Analysis (LSA): Uses SVD in NLP for dimensionality reduction.
- Recommendation Systems: Matrix factorization with SVD finds latent features of users and items.
- Noise Reduction: Keep only top singular values to approximate data.
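The PCA connection in the list above can be shown directly: the squared singular values of the centered data matrix (scaled by $n-1$) equal the eigenvalues of its covariance matrix. A short sketch with synthetic data (the random matrix here is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # synthetic data: 100 samples, 3 features
Xc = X - X.mean(axis=0)         # center the data

# PCA via SVD: rows of Vt are the principal directions
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_from_svd = S**2 / (len(X) - 1)

# Same variances from the eigendecomposition of the covariance matrix
cov = Xc.T @ Xc / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)

print(np.allclose(np.sort(var_from_svd), np.sort(eigvals)))  # True
```

Working from the SVD of the data matrix is also numerically preferable to forming the covariance matrix explicitly.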
Hands-on with Python and Rust
::: code-group

```python [Python]
import numpy as np

A = np.array([[1, 0], [0, 1], [0, 0]])

# Perform SVD
U, S, Vt = np.linalg.svd(A)

print("U =\n", U)
print("Singular values =", S)
print("V^T =\n", Vt)
```

```rust [Rust]
use ndarray::array;
use ndarray::Array2;
use ndarray_linalg::SVD;

fn main() {
    let a: Array2<f64> = array![
        [1.0, 0.0],
        [0.0, 1.0],
        [0.0, 0.0]
    ];

    // Perform SVD
    let (u, s, vt) = a.svd(true, true).unwrap();

    println!("U = {:?}", u.unwrap());
    println!("Singular values = {:?}", s);
    println!("V^T = {:?}", vt.unwrap());
}
```

:::
Connection to ML
- Data compression → keep only the top singular values.
- Noise filtering → discard small singular values.
- Feature extraction → reduced representations (PCA, LSA).
Next Steps
Continue to Positive Semi-Definite Matrices and Covariance.