The eigenvectors of a square matrix are the non-zero vectors which, after being multiplied by the matrix, remain proportional to the original vector, i.e. any vector that satisfies the equation:

$$A\mathbf{x} = \lambda\mathbf{x}$$

where $A$ is the matrix in question, $\mathbf{x}$ is the eigenvector and $\lambda$ is the associated eigenvalue.
As will become clear later on, eigenvectors are not unique in the sense that any eigenvector can be multiplied by a constant to form another eigenvector. For each eigenvector there is only one associated eigenvalue, however.
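This scaling property is easy to check numerically. Below is a minimal sketch using NumPy and a made-up diagonal matrix (the matrix and the eigenvector are assumptions for illustration, not an example from the text):

```python
import numpy as np

# Hypothetical 2x2 matrix chosen so the eigenpairs are obvious
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# x = (1, 0) is an eigenvector of A with eigenvalue 2
x = np.array([1.0, 0.0])
assert np.allclose(A @ x, 2 * x)

# Any non-zero scalar multiple of x is also an eigenvector,
# and it has the SAME eigenvalue
for c in (5.0, -0.5, 100.0):
    assert np.allclose(A @ (c * x), 2 * (c * x))
```

The loop illustrates the point made above: scaling an eigenvector gives another eigenvector, but the eigenvalue attached to it does not change.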
If you consider a matrix as a stretching, shearing or reflection transformation of the plane, you can see that the eigenvectors define the lines passing through the origin that are left unchanged by the transformation[1].
Note that square matrices of any size, not just $2 \times 2$ matrices, can have eigenvectors and eigenvalues.
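To see this for a larger matrix, here is a sketch using NumPy's `numpy.linalg.eig` on a made-up $3 \times 3$ matrix (an assumed example, not one taken from the text):

```python
import numpy as np

# Hypothetical upper-triangular 3x3 matrix; its eigenvalues are
# simply the diagonal entries 2, 3 and 4
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)

# A 3x3 matrix yields 3 eigenvalue/eigenvector pairs;
# each column v of eigvecs satisfies A v = lambda v
assert len(eigvals) == 3
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)
```

`numpy.linalg.eig` returns the eigenvectors as the *columns* of its second return value, which is why the loop iterates over the transpose.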
In order to find the eigenvectors of a matrix we must start by finding the eigenvalues. To do this we take everything over to the LHS of the equation:

$$A\mathbf{x} - \lambda\mathbf{x} = \mathbf{0}$$

then we pull the vector $\mathbf{x}$ outside of a set of brackets, inserting the identity matrix $I$ so that the subtraction makes sense:

$$(A - \lambda I)\mathbf{x} = \mathbf{0}$$

The only way this can have a non-zero solution is if $(A - \lambda I)$ does not have an inverse[2], therefore we find values of $\lambda$ such that the determinant of $(A - \lambda I)$ is zero:

$$\det(A - \lambda I) = 0$$
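The determinant condition can be checked symbolically. Here is a sketch using SymPy with a made-up $2 \times 2$ matrix (the matrix, and hence the resulting eigenvalues 2 and 5, are assumptions for illustration only):

```python
import sympy as sp

lam = sp.symbols('lambda')

# Made-up 2x2 matrix for illustration only
A = sp.Matrix([[4, 1],
               [2, 3]])

# det(A - lambda*I) gives the characteristic polynomial,
# here lambda**2 - 7*lambda + 10
char_poly = sp.expand((A - lam * sp.eye(2)).det())

# Setting the polynomial to zero and solving yields the eigenvalues
eigenvalues = sp.solve(sp.Eq(char_poly, 0), lam)
```

For an $n \times n$ matrix the determinant expands to a degree-$n$ polynomial in $\lambda$, which is why a $2 \times 2$ matrix leads to a quadratic, as in the example below.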
Once we have a set of eigenvalues we can substitute them back into the original equation to find the eigenvectors. As always, the procedure becomes clearer when we try some examples:
Example 1
Q)
Find the eigenvalues and eigenvectors of the matrix:
A)
First we start by finding the eigenvalues, using the equation derived above:
If you like, just consider this step as, “subtract $\lambda$ from each diagonal element of the matrix in the question”.
Next we derive a formula for the determinant, which must equal zero:
We now need to find the roots of this quadratic equation in $\lambda$.
In this case the quadratic factorises straightforwardly to:
The solutions to this equation are and . These are the eigenvalues of