# Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are fundamental concepts in linear algebra with deep applications in machine learning, such as PCA (Principal Component Analysis), dimensionality reduction, and stability analysis of algorithms.


For a square matrix $A \in \mathbb{R}^{n \times n}$, a nonzero vector $v$ is an eigenvector of $A$ if:

$$
A v = \lambda v
$$

where $\lambda$ is the corresponding eigenvalue.

- $v$ gives a direction that is unchanged (up to scaling) by $A$.
- $\lambda$ tells how much the vector is stretched or shrunk.

::: info Explanation of Eigenvalues & Eigenvectors
Think of a matrix $A$ as a transformation. Most vectors change direction when transformed, but eigenvectors keep their direction, only scaling by $\lambda$.

- In ML: PCA finds eigenvectors of the covariance matrix → principal directions of data variance.
:::
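To make the PCA connection concrete, here is a minimal sketch (an illustrative addition, not from the original text) that eigendecomposes the covariance matrix of synthetic 2-D data; the data generation and variable names are assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data with more variance along the first axis (assumed example)
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 1.0]])

# Center the data, then eigendecompose the covariance matrix
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: covariance is symmetric

# Sort by decreasing eigenvalue: columns of eigvecs are principal directions
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
print("Explained variance:", eigvals)
print("Principal directions:\n", eigvecs)
```

Using `np.linalg.eigh` rather than `eig` exploits the symmetry of the covariance matrix and guarantees real eigenvalues.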

Consider the matrix

$$
A = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}
$$

If $v = [1, 0]^T$, then:

$$
Av = \begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \end{bmatrix} = 2v
$$

So $v$ is an eigenvector with eigenvalue $\lambda = 2$.

Similarly, $[0, 1]^T$ is an eigenvector with eigenvalue $\lambda = 3$.


Eigenvalues are found by solving the characteristic equation:

$$
\det(A - \lambda I) = 0
$$

This yields a polynomial of degree $n$ in $\lambda$; each root is an eigenvalue.
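As a quick check of this procedure (an illustrative sketch, not from the original text): for a $2 \times 2$ matrix, $\det(A - \lambda I)$ expands to $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det(A)$, and its roots can be found numerically:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)
print("Characteristic polynomial roots:", sorted(roots.real))  # [2.0, 3.0]
```

The roots match the eigenvalues read off the diagonal of $A$ in the example above.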


::: code-group

```python [Python]
import numpy as np

A = np.array([[2, 0], [0, 3]])

# Compute eigenvalues and eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print("Eigenvalues:", eigenvalues)
print("Eigenvectors:\n", eigenvectors)
```

```rust [Rust]
use ndarray::array;
use ndarray::Array2;
use ndarray_linalg::Eig;

fn main() {
    let a: Array2<f64> = array![
        [2.0, 0.0],
        [0.0, 3.0]
    ];
    // Eigendecomposition; eig returns complex eigenvalues in general
    let (eigenvalues, eigenvectors) = a.eig().unwrap();
    println!("Eigenvalues: {:?}", eigenvalues);
    println!("Eigenvectors:\n{:?}", eigenvectors);
}
```

:::


- PCA → uses eigenvectors of the covariance matrix to find directions of maximum variance.
- Spectral clustering → uses eigenvalues of graph Laplacian matrices.
- Stability analysis → eigenvalues determine convergence rates of iterative methods.
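To sketch the stability/convergence point, power iteration (an illustrative example, not from the original text) converges to the dominant eigenvector, and its convergence rate is set by the eigenvalue ratio $|\lambda_2| / |\lambda_1|$:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])

# Power iteration: repeatedly multiply by A and renormalize; the iterate
# converges to the eigenvector of the largest-magnitude eigenvalue at a
# rate governed by |lambda_2| / |lambda_1| (here 2/3).
v = np.array([1.0, 1.0])
for _ in range(50):
    v = A @ v
    v /= np.linalg.norm(v)

dominant = v @ A @ v  # Rayleigh quotient estimates the eigenvalue
print("Dominant eigenvalue ~", dominant)  # close to 3
print("Eigenvector ~", v)                 # close to [0, 1]
```

The same eigenvalue-ratio argument explains why iterative methods converge quickly when the spectrum is well separated and slowly when eigenvalues cluster.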

Continue to Singular Value Decomposition (SVD).