
Spectral Decomposition & Applications

Spectral decomposition is at the heart of many machine learning techniques.
It connects eigenvalues, eigenvectors, and matrix diagonalization to practical applications like PCA and spectral clustering.


1. Spectral Decomposition of Symmetric Matrices


If $A$ is a real symmetric matrix ($A = A^T$), then it can be diagonalized as:

$$A = Q \Lambda Q^T$$

where:

  • $Q$ = orthogonal matrix of eigenvectors
  • $\Lambda$ = diagonal matrix of eigenvalues

This is known as the spectral theorem.

::: info Why important?

  • Guarantees orthogonal eigenvectors for symmetric matrices.
  • Forms the basis of PCA (covariance matrices are symmetric).

:::
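The theorem is easy to check numerically. A minimal sketch using NumPy's `eigh` (which is specialized for symmetric matrices): reconstruct $A$ from its eigenvectors and eigenvalues, and confirm $Q$ is orthogonal.

```python
import numpy as np

# Any real symmetric matrix
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# eigh returns eigenvalues (ascending) and orthonormal eigenvectors
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Reconstruct A = Q Λ Q^T
A_rec = Q @ Lam @ Q.T
print(np.allclose(A, A_rec))             # True
print(np.allclose(Q.T @ Q, np.eye(2)))   # Q is orthogonal: True
```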

2. Graph Laplacians & Spectral Clustering

For a graph $G$ with adjacency matrix $W$ and degree matrix $D$, the graph Laplacian is:

$$L = D - W$$

  • $L$ is symmetric and positive semi-definite.
  • Its eigenvectors capture graph structure.
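Positive semi-definiteness can be verified directly: every Laplacian eigenvalue is non-negative, and the smallest is always 0 because $L$ annihilates the all-ones vector. A sketch for a 3-node path graph:

```python
import numpy as np

# Path graph on 3 nodes: 0 - 1 - 2
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
D = np.diag(W.sum(axis=1))   # degree matrix
L = D - W                    # graph Laplacian

eigvals = np.linalg.eigvalsh(L)
print(eigvals)               # all >= 0; smallest is 0
print(L @ np.ones(3))        # the all-ones vector is in the null space
```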

Spectral Clustering:

  1. Compute the Laplacian $L$.
  2. Find the eigenvectors corresponding to the $k$ smallest eigenvalues.
  3. Treat them as features and cluster with k-means.

This uses spectral decomposition to reveal community structure in graphs.
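scikit-learn packages these three steps as `sklearn.cluster.SpectralClustering`. A minimal sketch on a precomputed affinity matrix (the two-triangle graph here is illustrative, not from the text above):

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Two triangles joined by a single edge (nodes 0-2 and 3-5)
W = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

sc = SpectralClustering(n_clusters=2, affinity="precomputed", random_state=0)
labels = sc.fit_predict(W)
print(labels)  # the two triangles land in different clusters
```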


3. Links to PCA & Dimensionality Reduction

  • PCA: Eigen-decomposition of covariance matrix.

    • $\Sigma = \frac{1}{m} X^T X$ (symmetric PSD).
    • Eigenvectors = principal components.
    • Eigenvalues = variance explained.
  • Manifold Learning: Spectral methods generalize PCA to nonlinear settings (e.g., Laplacian eigenmaps).
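PCA follows in a few lines from the spectral decomposition of the covariance matrix. A sketch (equivalent, up to sign, to what `sklearn.decomposition.PCA` computes; the synthetic data is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
Xc = X - X.mean(axis=0)              # center the data

Sigma = (Xc.T @ Xc) / Xc.shape[0]    # covariance, Σ = (1/m) XᵀX
eigvals, eigvecs = np.linalg.eigh(Sigma)

# Sort descending: largest eigenvalue = most variance explained
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]       # principal components (columns)
explained = eigvals[order] / eigvals.sum()
print(explained)                     # variance ratio per component
```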


::: code-group

```python [Python]
import numpy as np
from sklearn.cluster import KMeans

# Example symmetric matrix
A = np.array([[2, 1], [1, 2]])

# Spectral decomposition (eigh is specialized for symmetric matrices)
eigvals, eigvecs = np.linalg.eigh(A)
print("Eigenvalues:", eigvals)
print("Eigenvectors:\n", eigvecs)

# Simple spectral clustering example (tiny graph)
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
D = np.diag(W.sum(axis=1))
L = D - W

eigvals, eigvecs = np.linalg.eigh(L)
print("\nGraph Laplacian Eigenvalues:", eigvals)

# Use the eigenvectors of the 2 smallest eigenvalues as features
X_spec = eigvecs[:, :2]
labels = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X_spec)
print("Cluster labels:", labels)
```

```rust [Rust]
use ndarray::{array, Array2, Axis};
use ndarray_linalg::{Eigh, UPLO};

fn main() {
    // Example symmetric matrix
    let a = array![[2.0, 1.0],
                   [1.0, 2.0]];

    // Spectral decomposition
    let (eigvals, eigvecs) = a.eigh(UPLO::Lower).unwrap();
    println!("Eigenvalues: {:?}", eigvals);
    println!("Eigenvectors:\n{:?}", eigvecs);

    // Graph Laplacian example (tiny graph)
    let w = array![
        [0.0, 1.0, 0.0],
        [1.0, 0.0, 1.0],
        [0.0, 1.0, 0.0]
    ];
    let d = Array2::from_diag(&w.sum_axis(Axis(1)));
    let l = &d - &w;

    let (lap_eigvals, lap_eigvecs) = l.eigh(UPLO::Lower).unwrap();
    println!("Graph Laplacian Eigenvalues: {:?}", lap_eigvals);
    println!("Eigenvectors:\n{:?}", lap_eigvecs);
}
```

:::


  • Spectral decomposition diagonalizes symmetric matrices.
  • Graph Laplacians enable spectral clustering.
  • PCA is an application of spectral decomposition to covariance matrices.