Eigenvalues and Eigenvectors
Eigenvalues and eigenvectors are fundamental concepts in linear algebra with deep applications in machine learning, such as PCA (Principal Component Analysis), dimensionality reduction, and stability analysis of algorithms.
Definition
For a square matrix $A$, a non-zero vector $v$ is an eigenvector of $A$ if

$$A v = \lambda v$$

where $\lambda$ is a scalar called the eigenvalue associated with $v$.

$v$ gives a direction that is unchanged (up to scaling) by $A$; $\lambda$ tells how much the vector is stretched or shrunk.
Explanation of Eigenvalues & Eigenvectors
Think of a matrix $A$ as a transformation that stretches, rotates, or flips vectors. Most vectors change direction when $A$ is applied, but an eigenvector keeps its direction: $A$ only scales it by its eigenvalue $\lambda$.
- In ML: PCA finds eigenvectors of the covariance matrix → principal directions of data variance (see the sketch below).
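To make the PCA connection concrete, here is a minimal sketch of the eigendecomposition step on a toy dataset (the dataset and variable names are illustrative, not from this series):

```python
import numpy as np

# Toy 2-D dataset: 200 points stretched along a diagonal direction
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [1.0, 1.0]])

# Center the data and form the covariance matrix
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)

# The covariance matrix is symmetric, so eigh (for symmetric/Hermitian
# matrices) is the right solver; it returns eigenvalues in ascending order
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort by descending variance: the columns of `eigenvectors` are the
# principal directions of the data
order = np.argsort(eigenvalues)[::-1]
print("Variance along each principal direction:", eigenvalues[order])
print("Principal directions (columns):\n", eigenvectors[:, order])
```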
Mini Example
If

$$A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}, \qquad v = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$$

then

$$A v = \begin{bmatrix} 2 \\ 0 \end{bmatrix} = 2 v$$

So $\lambda = 2$ is an eigenvalue of $A$ with eigenvector $[1, 0]^T$.

Similarly, $\lambda = 3$ is an eigenvalue with eigenvector $[0, 1]^T$, since $A [0, 1]^T = [0, 3]^T = 3 [0, 1]^T$.
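A quick numerical check of this example, assuming NumPy (this verifies $A v = \lambda v$ directly, without calling an eigensolver):

```python
import numpy as np

A = np.array([[2, 0], [0, 3]])
v = np.array([1, 0])

# A @ v should equal 2 * v, confirming lambda = 2 for eigenvector [1, 0]
print(A @ v)                       # [2 0]
print(np.allclose(A @ v, 2 * v))   # True
```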
Characteristic Equation
Eigenvalues are found by solving:

$$\det(A - \lambda I) = 0$$

This yields an $n$-th degree polynomial in $\lambda$ (the characteristic polynomial) for an $n \times n$ matrix; its roots are the eigenvalues.
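As a sketch of this route in NumPy (np.poly and np.roots are standard NumPy functions; in practice np.linalg.eig is the numerically preferred tool), the characteristic polynomial of the example matrix above is $\lambda^2 - 5\lambda + 6$, whose roots are 2 and 3:

```python
import numpy as np

A = np.array([[2, 0], [0, 3]])

# Coefficients of det(A - lambda*I), highest degree first: [1, -5, 6]
coeffs = np.poly(A)
print("Characteristic polynomial coefficients:", coeffs)

# The polynomial's roots are the eigenvalues
print("Eigenvalues:", np.roots(coeffs))
```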
Hands-on with Python and Rust
Python:

```python
import numpy as np

A = np.array([[2, 0], [0, 3]])

# Eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)
```
Rust:

```rust
use ndarray::array;
use ndarray::Array2;
use ndarray_linalg::Eig;

fn main() {
    let a: Array2<f64> = array![
        [2.0, 0.0],
        [0.0, 3.0]
    ];

    // Eigen decomposition
    let (eigenvalues, eigenvectors) = a.eig().unwrap();
    println!("Eigenvalues: {:?}", eigenvalues);
    println!("Eigenvectors:\n{:?}", eigenvectors);
}
```
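A note on the Rust version: ndarray-linalg returns complex-valued arrays (c64) even for real input, since a real matrix can have complex eigenvalues, and the crate needs a LAPACK backend enabled through a Cargo feature (for example openblas-static); check the crate's documentation for the available options.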
Connection to ML
- PCA → uses eigenvectors of covariance matrix to find directions of maximum variance.
- Spectral clustering → uses eigenvalues of Laplacian matrices.
- Stability analysis → eigenvalues determine convergence rates of iterative methods (see the sketch after this list).
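To illustrate the last point, here is a power-iteration sketch (a standard textbook method, not something specific to this series): repeatedly applying $A$ to a vector and re-normalizing converges to the dominant eigenvector, at a rate governed by the ratio of the two largest eigenvalue magnitudes.

```python
import numpy as np

def power_iteration(A, num_iters=100):
    """Approximate the dominant eigenvalue and eigenvector of A."""
    v = np.random.default_rng(0).normal(size=A.shape[0])
    for _ in range(num_iters):
        v = A @ v
        v /= np.linalg.norm(v)  # re-normalize to avoid overflow
    # Rayleigh quotient: eigenvalue estimate for the current direction
    return v @ A @ v, v

A = np.array([[2.0, 0.0], [0.0, 3.0]])
lam, v = power_iteration(A)
print("Dominant eigenvalue ≈", lam)  # close to 3
print("Dominant eigenvector ≈", v)   # close to [0, 1] up to sign
```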
Next Steps
Continue to Singular Value Decomposition (SVD).