# Matrix Theory: An essential proof for eigenvector computations

I’ve avoided proofs unless absolutely necessary, but the relation between the representations of the same linear transformation in two different bases is important. Given that $A_S$ is the linear transformation matrix in the standard basis S, and $A_B$ is its counterpart in basis B, we can write the relation between them as:

$A_B = C^{-1}A_SC$

$A_S = CA_BC^{-1}$

where C is the change-of-basis matrix that defines the similarity transformation. We’ve seen this relation already; check here if you’ve forgotten about it.
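This relation is easy to check numerically. Below is a small NumPy sketch with an illustrative transformation $A_S$ and basis B of my own choosing (they are not from the article); the columns of C are the basis vectors of B expressed in the standard basis S:

```python
import numpy as np

# A_S: the transformation in the standard basis S (illustrative example)
A_S = np.array([[2.0, 1.0],
                [0.0, 3.0]])

# Columns of C are the basis vectors of B, written in S: B = {[1,0], [1,1]}
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C_inv = np.linalg.inv(C)

# A_B = C^{-1} A_S C : the same transformation, expressed in basis B
A_B = C_inv @ A_S @ C

# Round trip: A_S = C A_B C^{-1} recovers the original matrix
A_S_back = C @ A_B @ C_inv
print(np.allclose(A_S, A_S_back))  # True
```

In this particular example the basis vectors of B happen to be eigenvectors of $A_S$, so $A_B$ comes out diagonal, which previews the diagonalisation discussed next.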

# Matrix Theory: Diagonalisation and Eigenvector Computation

$A = \left( \begin{array}{cc} 2 & 0 \\ 0 & 5 \end{array} \right)$

The operation it performed on the basis vectors of the standard basis S was one of scaling, and scaling only. If a linear transformation matrix only scales a vector when operating on it, that vector is an eigenvector of the matrix, and the scale factor is the corresponding eigenvalue. This is just the definition of an eigenvector, which I rewrite below:

$Ax = \lambda x$
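This definition can be checked directly with NumPy’s `np.linalg.eig`, which returns the eigenvalues and the eigenvectors (as columns) of a matrix. For the diagonal matrix above, the eigenvalues are simply its diagonal entries, 2 and 5:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column x of `eigenvectors` satisfies A @ x == lambda * x
for lam, x in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ x, lam * x))
```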

# Matrix Theory: Basis change and Similarity transformations

### Basis Change

Understand that there is nothing particularly special about the standard basis vectors [1,0] and [0,1]; they are simply a convenient choice. All 2D vectors may be represented as linear combinations of these vectors. Thus, the vector [7,24] may be written as:

$\left( \begin{array}{c} 7 \\ 24 \end{array} \right) = 7 \left( \begin{array}{c} 1 \\ 0 \end{array} \right) + 24 \left( \begin{array}{c} 0 \\ 1 \end{array} \right)$
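The linear combination above is one line of NumPy; the coordinates of a vector in the standard basis are just its components:

```python
import numpy as np

# Standard basis vectors of 2D space
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# [7, 24] as a linear combination of the standard basis vectors
v = 7 * e1 + 24 * e2
print(v)  # components of the vector in the standard basis
```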

# Matrix Theory: Linear transformations and Basis vectors

### Symmetric Matrices

A symmetric matrix looks like this:

$A = \left( \begin{array}{cccc} a & d & n & w \\ d & b & h & e \\ n & h & c & i \\ w & e & i & f \end{array} \right)$

Notice how the values are mirrored across the main diagonal a-b-c-f; equivalently, $A = A^T$. This holds true for any symmetric matrix.
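The mirror property is exactly the statement $A = A^T$, so symmetry can be tested by comparing a matrix with its transpose. A small check, using a 3×3 example of my own rather than the symbolic matrix above:

```python
import numpy as np

# An example symmetric matrix: entries are mirrored across the main diagonal
A = np.array([[1, 4, 2],
              [4, 5, 6],
              [2, 6, 3]])

# A is symmetric exactly when it equals its own transpose: A[i, j] == A[j, i]
is_symmetric = np.array_equal(A, A.T)
print(is_symmetric)  # True
```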