jthomas.site// notebook · v.4.2026
Machine Learning, Visualized · Vol. II

Eigenvalues
& Eigenvectors

A linear transformation reshapes the plane. Most vectors twist and turn under it — but a precious few keep pointing the same way, only stretched. Those are eigenvectors, and the amount they stretch is the eigenvalue.

The concept

An eigenvector of a matrix A is a direction that A leaves alone — the matrix stretches it, but never rotates it.

The amount of stretch is the eigenvalue λ, defined by Av = λv. A 2×2 matrix has at most two independent such directions (and sometimes none at all, if the matrix is essentially a rotation). They are the matrix's natural axes — the skeleton on which everything else hangs.
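The defining property is easy to check in code. A minimal sketch with NumPy, using an arbitrary symmetric 2×2 matrix chosen for illustration:

```python
import numpy as np

# An arbitrary symmetric 2x2 matrix, values chosen for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and unit eigenvectors (as columns)
eigvals, eigvecs = np.linalg.eig(A)

for i in range(2):
    lam, v = eigvals[i], eigvecs[:, i]
    # The defining property: A @ v equals lam * v -- same direction, scaled
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigvals))  # [1. 3.]
```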

If you apply the matrix to any starting vector over and over and re-normalize, you converge to the eigenvector with the biggest |λ|. That move — power iteration — is the engine behind a surprising amount of machine learning.
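Power iteration fits in a few lines. A sketch (the matrix and step count are my own choices, not anything from the demo):

```python
import numpy as np

def power_iteration(A, steps=100):
    """Apply A repeatedly and re-normalize; for a generic start this
    converges to the eigenvector with the largest |eigenvalue|."""
    v = np.random.default_rng(0).normal(size=A.shape[0])
    for _ in range(steps):
        v = A @ v
        v /= np.linalg.norm(v)
    lam = v @ A @ v          # Rayleigh quotient: the eigenvalue estimate
    return lam, v

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)
print(round(lam, 6))          # 3.0 -- the dominant eigenvalue
```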

Why ML cares

Google's original PageRank ranked the web by computing the dominant eigenvector of a giant link-matrix. Spectral clustering finds groups using eigenvectors of a graph Laplacian. Diffusion maps, Laplacian eigenmaps, normal-mode analysis — all eigen-things.

Whenever a method "finds the dominant direction" or "decomposes into independent components," the underlying engine is almost always an eigenvector calculation. They are the most-reused primitive in unsupervised learning.
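As one concrete instance, the spectral-clustering trick can be sketched on a toy graph of my own making: two triangles joined by a single edge. The sign pattern of the Laplacian's second-smallest eigenvector (the Fiedler vector) splits the graph into its two natural groups:

```python
import numpy as np

# Two tight triangles (nodes 0-2 and 3-5) joined by one weak edge
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A        # unnormalized graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)  # eigh, since L is symmetric
fiedler = eigvecs[:, 1]               # second-smallest eigenvector

# Its sign pattern splits the graph into the two triangles
print(np.sign(fiedler))
```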

Try this
  1. Click Symmetric — the two accent arrows meet at a right angle. Symmetric matrices always have orthogonal eigenvectors.
  2. Click Rotation — the eigenvectors disappear. A pure rotation has no real eigenvectors, because every direction changes.
  3. Click PageRank-ish, switch to III. Iterate, hit Play. Watch every starting vector fall onto a single line — the dominant eigenvector.
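Steps 1 and 2 can also be verified numerically (the example matrices here are my own, not the ones behind the buttons):

```python
import numpy as np

# Step 1: a symmetric matrix's eigenvectors meet at a right angle
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, vecs = np.linalg.eigh(S)
print(abs(vecs[:, 0] @ vecs[:, 1]))   # ~0: orthogonal

# Step 2: a pure rotation has no real eigenvectors at all --
# its eigenvalues are the complex conjugate pair e^{+-i*theta}
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.linalg.eigvals(R))
```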
Where you've seen this · 4 examples
Google's PageRank, 1998

When Google launched, it ranked the web by computing the dominant eigenvector of a link matrix covering tens of millions of pages. The intuition: the web is a Markov chain of clicks, and the page everyone keeps coming back to has the largest steady-state probability. You watched this happen in mode III above.
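A hedged miniature of the idea, with a made-up three-page web and the damping factor from the original paper:

```python
import numpy as np

# A made-up three-page web. Column j lists where page j's links go,
# normalized so each column sums to 1. Everyone links to page 2.
links = np.array([[0.0, 0.5, 0.0],
                  [0.5, 0.0, 0.0],
                  [0.5, 0.5, 1.0]])

d = 0.85                        # damping factor from the 1998 paper
n = links.shape[0]
G = d * links + (1 - d) / n     # teleport with probability 1 - d

rank = np.full(n, 1.0 / n)      # start uniform...
for _ in range(100):
    rank = G @ rank             # ...and power-iterate
print(rank.argmax())            # 2 -- the most-linked page wins
```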

Spotify's "Discover Weekly"

Recommender systems factorize a (users × songs) matrix to find latent music-taste dimensions. Those dimensions are, in effect, eigenvectors of the data's song–song similarity matrix — not "rock vs jazz" but abstract directions discovered from the listening data itself. New songs get scored against your profile.
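A toy sketch of the connection (the play-count matrix is invented): the right singular vectors of a (users × songs) matrix are exactly the eigenvectors of its song–song Gram matrix:

```python
import numpy as np

# Invented (users x songs) play counts: two taste groups are visible
R = np.array([[5.0, 4.0, 0.0, 0.0],
              [4.0, 5.0, 1.0, 0.0],
              [0.0, 0.0, 5.0, 4.0],
              [0.0, 1.0, 4.0, 5.0]])

# Right singular vectors of R are eigenvectors of the song-song
# Gram matrix R.T @ R -- the latent "taste dimensions"
U, s, Vt = np.linalg.svd(R)
gram_vals, gram_vecs = np.linalg.eigh(R.T @ R)

# Singular values squared match the Gram matrix's eigenvalues
assert np.allclose(np.sort(s**2), np.sort(gram_vals))

# The top taste dimension, against which a new song could be scored
print(np.round(np.abs(Vt[0]), 2))
```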

Skyscrapers and bridge collapses

Engineers compute eigenvalues of a building's stiffness matrix to find its natural vibration modes — the frequencies at which it will sway. The eigenvectors are the shapes of those sways. The 1940 Tacoma Narrows bridge collapse is the textbook warning: wind-driven aeroelastic flutter pumped energy into one of the bridge's torsional eigenmodes until the deck tore itself apart.
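A minimal sketch, with a three-mass spring chain (unit masses and springs, purely illustrative) standing in for a real structure's stiffness matrix:

```python
import numpy as np

# Stiffness matrix of a 3-mass spring chain with unit masses and
# unit springs -- a stand-in for a real structure's stiffness matrix
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

eigvals, modes = np.linalg.eigh(K)
freqs = np.sqrt(eigvals)   # natural frequencies: omega = sqrt(lambda)

# Each frequency comes with a mode shape: the column of `modes`
# describing how the masses sway together at that frequency
print(np.round(freqs, 3))  # [0.765 1.414 1.848]
```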

Quantum mechanics

In quantum systems, observable measurements (energy, momentum, spin) are eigenvalues of an operator; the states the system can be measured in are the eigenvectors. Schrödinger's equation is, ultimately, an eigenvalue problem.
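A two-level toy system makes this concrete (the Hamiltonian here is just the Pauli-X matrix, chosen for illustration):

```python
import numpy as np

# A two-level Hamiltonian (the Pauli-X matrix, purely illustrative):
# measurable energies are its eigenvalues, and the states the system
# can be measured in are its eigenvectors
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

energies, states = np.linalg.eigh(H)
print(energies)                        # [-1.  1.]: the allowed energies

# Time-independent Schrodinger equation, H|psi> = E|psi>, per state
for E, psi in zip(energies, states.T):
    assert np.allclose(H @ psi, E * psi)
```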

Further reading