Eigenwhat?

This demo will help you build intuition for the behavior of eigenvectors and eigenvalues of a 2x2 symmetric real matrix. As you’ll remember, an eigenvector $v$ of a matrix $M$ is any nonzero vector that satisfies the following equation:

\[Mv = \lambda v\]

In other words, if transforming a vector by $M$ leaves you with a scaled version of that same vector, then it is an eigenvector of $M$. The amount by which the vector is scaled is the eigenvalue $\lambda$.
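
You can check this equation numerically. Here is a minimal sketch in Python with numpy (the demo itself is not written in Python, and the matrix below is just an arbitrary example):

```python
import numpy as np

# An arbitrary symmetric 2x2 matrix.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is numpy's eigensolver for symmetric matrices; it returns the
# eigenvalues in ascending order and unit-length eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eigh(M)

for i in range(2):
    v = eigenvectors[:, i]              # an eigenvector of M
    lam = eigenvalues[i]                # its eigenvalue
    assert np.allclose(M @ v, lam * v)  # M v = lambda v
```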

When the 2x2 matrix is symmetric, its eigenvalues are real and there exist two eigenvectors that are orthogonal to each other. In that case, we can write the matrix as:

\[M = U \Sigma U^T\]

where $U$ is a matrix whose columns are the eigenvectors and $\Sigma$ is a diagonal matrix whose diagonal entries are the eigenvalues. If you’ll remember from linear algebra, every time you have a square matrix whose rows (or columns) are orthonormal, that is an orthogonal matrix (a rotation, possibly combined with a reflection), and orthogonal matrices are such that their transposes are their inverses. So a good way to think about this is that the eigenvectors give you a decomposition of the matrix $M$ into simpler matrices: a rotation, a scale along the axes, and the inverse rotation.
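
This factorization is easy to verify numerically (again a numpy sketch; `eigh` returns the eigenvectors as the columns of $U$):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, U = np.linalg.eigh(M)  # columns of U are the eigenvectors
Sigma = np.diag(eigenvalues)

# U is orthogonal, so its transpose is its inverse...
assert np.allclose(U.T @ U, np.eye(2))
# ...and the three factors multiply back to M.
assert np.allclose(U @ Sigma @ U.T, M)
```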

In other words, the operation of every symmetric matrix $M$ on a vector $v$ is $Mv = U \Sigma U^T v$, or: first rotate $v$ into the basis of the eigenvectors (multiply by $U^T$), then scale each coordinate by the corresponding eigenvalue (multiply by $\Sigma$), then rotate the result back (multiply by $U$).
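
Carrying out those three steps one factor at a time gives the same answer as applying $M$ directly (a numpy sketch, reusing the example matrix from above):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalues, U = np.linalg.eigh(M)
Sigma = np.diag(eigenvalues)

v = np.array([1.0, -0.5])  # an arbitrary test vector

w = U.T @ v    # rotate v into the eigenvector basis
w = Sigma @ w  # scale each coordinate by its eigenvalue
w = U @ w      # rotate the result back

assert np.allclose(w, M @ v)  # identical to applying M directly
```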

In the interactive demo below, the unit-length eigenvectors are represented by the red dots.

[Interactive demo: points transformed by a symmetric 2x2 matrix. Enter the values for $M$ ($M_{00}$, $M_{01}$, $M_{10}$, $M_{11}$); the demo displays the resulting eigenvector matrix $U$ ($U_{00}$, $U_{01}$, $U_{10}$, $U_{11}$) and the eigenvalues $\lambda_0$, $\lambda_1$.]
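
For a 2x2 symmetric matrix, the eigendecomposition also has a simple closed form, which is the kind of computation a demo like this plausibly does internally. The sketch below is illustrative Python, not the demo’s actual source, and the function name is made up:

```python
import math

def symmetric_2x2_eigen(a, b, c):
    """Eigenvalues and unit eigenvectors of [[a, b], [b, c]], ascending."""
    mean = (a + c) / 2.0
    # Half the gap between the eigenvalues; always real for symmetric input.
    radius = math.hypot((a - c) / 2.0, b)
    lam0, lam1 = mean - radius, mean + radius
    if b == 0.0:
        # Already diagonal: the coordinate axes are the eigenvectors.
        pairs = sorted([(a, (1.0, 0.0)), (c, (0.0, 1.0))], key=lambda p: p[0])
        return pairs[0], pairs[1]
    def unit_eigvec(lam):
        # (M - lam I) annihilates (b, lam - a), so that is an eigenvector.
        x, y = b, lam - a
        n = math.hypot(x, y)
        return (x / n, y / n)
    return (lam0, unit_eigvec(lam0)), (lam1, unit_eigvec(lam1))
```

For the example matrix used above ($a = 2$, $b = 1$, $c = 3$), this returns eigenvalues of $(5 \mp \sqrt{5})/2 \approx 1.382$ and $3.618$.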

More reading

If you really want to understand eigenvectors and eigenvalues, the best thing to read continues to be chapter 5 of Shewchuk’s classic An Introduction to the Conjugate Gradient Method Without the Agonizing Pain.

The particular presentation in this demo was inspired by Blinn’s also-classic Consider the Lowly 2x2 Matrix.