# Matrix Theory: An essential proof for eigenvector computations

I’ve avoided proofs unless absolutely necessary, but the relation between the same eigenvector expressed in two different bases is important.
Given that $A_S$ is the linear transformation matrix in the standard basis S, and $A_B$ is its counterpart in basis B, we can write the relation between them as:

$A_B = C^{-1}A_SC$

$A_S = CA_BC^{-1}$

where $C$ is the similarity transformation matrix. We’ve seen this relation already; check here if you’ve forgotten about it.
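As a quick numerical sanity check of this relation, here is a minimal NumPy sketch. The particular matrices $A_S$ and $C$ below are arbitrary illustrative choices (any invertible $C$ works), not anything from the post itself:

```python
import numpy as np

# A hypothetical transformation in the standard basis S (illustrative values).
A_S = np.array([[2.0, 1.0],
                [0.0, 3.0]])

# A hypothetical change-of-basis matrix C; its columns are the basis
# vectors of B written in S. Any invertible matrix would do here.
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# A_B = C^{-1} A_S C : the same transformation expressed in basis B.
A_B = np.linalg.inv(C) @ A_S @ C

# Going the other way, A_S = C A_B C^{-1} recovers the original matrix.
A_S_recovered = C @ A_B @ np.linalg.inv(C)
print(np.allclose(A_S, A_S_recovered))  # True
```

The two formulas are inverses of each other, so round-tripping through basis B and back returns the matrix we started with.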

Now, the definition of an eigenvector gives us:

$A_Sx_S=\lambda x_S$

$x_S$ is the eigenvector in the standard basis S. Substituting the expression $A_S = CA_BC^{-1}$ we obtained previously into the eigenvector equation, we get:

$CA_BC^{-1}x_S = \lambda x_S$

$A_B\left(C^{-1}x_S\right) = \lambda\left(C^{-1}x_S\right)$

If we take $x_B = C^{-1}x_S$, we get:

$A_Bx_B = \lambda x_B$

This tells us two important things. First, the eigenvalue is still $\lambda$. Second, the eigenvector expressions in the two bases are related as:

$x_B = C^{-1}x_S$

This gives us the relation we need, and justifies the logic in this post.
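The whole derivation can be checked numerically as well. The sketch below (again with illustrative, hypothetical choices of $A_S$ and $C$) confirms that $x_B = C^{-1}x_S$ is an eigenvector of $A_B$ with the same eigenvalue $\lambda$:

```python
import numpy as np

# Illustrative matrices: A_S in the standard basis, C an invertible
# change-of-basis matrix (both are arbitrary choices for this check).
A_S = np.array([[2.0, 1.0],
                [0.0, 3.0]])
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])

C_inv = np.linalg.inv(C)
A_B = C_inv @ A_S @ C  # the transformation expressed in basis B

# Take an eigenpair (lambda, x_S) of A_S in the standard basis.
eigvals, eigvecs = np.linalg.eig(A_S)
lam, x_S = eigvals[0], eigvecs[:, 0]

# The derived relation: x_B = C^{-1} x_S should satisfy
# A_B x_B = lambda x_B, with the SAME eigenvalue lambda.
x_B = C_inv @ x_S
print(np.allclose(A_B @ x_B, lam * x_B))  # True
```

The eigenvalue is unchanged because similar matrices share the same characteristic polynomial; only the coordinates of the eigenvector change with the basis.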