Tag Archives: similarity transformations

Matrix Theory: An essential proof for eigenvector computations

I’ve avoided proofs unless absolutely necessary, but the relation between the representations of the same eigenvector in two different bases is important.
Given that A_S is the linear transformation matrix in the standard basis S, and A_B is its counterpart in basis B, we can write the relation between them as:

A_B = C^{-1}A_SC, \qquad A_S = CA_BC^{-1}

where C is the change-of-basis matrix that defines the similarity transformation. We’ve seen this relation already; check here if you’ve forgotten about it.
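To make the relation concrete, here is a minimal NumPy sketch (the matrices A_S and C below are illustrative choices of mine, not taken from the post) that checks A_B = C^{-1}A_SC numerically and confirms that an eigenvector of A_S, once re-expressed in basis B, is an eigenvector of A_B with the same eigenvalue:

```python
import numpy as np

# Illustrative choices: A_S acts in the standard basis,
# C holds the basis-B vectors as its columns.
A_S = np.array([[2.0, 1.0],
                [0.0, 5.0]])
C = np.array([[1.0, 1.0],
              [0.0, 1.0]])

C_inv = np.linalg.inv(C)
A_B = C_inv @ A_S @ C            # A_B = C^{-1} A_S C

# Take an eigenpair of A_S in the standard basis ...
eigvals, eigvecs = np.linalg.eig(A_S)
lam, x_S = eigvals[0], eigvecs[:, 0]

# ... and re-express the eigenvector in basis B: x_B = C^{-1} x_S.
x_B = C_inv @ x_S

print(np.allclose(A_B @ x_B, lam * x_B))   # True: same eigenvalue in basis B
print(np.allclose(C @ A_B @ C_inv, A_S))   # True: A_S = C A_B C^{-1}
```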

Matrix Theory: Diagonalisation and Eigenvector Computation

I return to the first example on basis vectors, from when I spoke of linear transformations. The linear transformation we had was this:

A = \left( \begin{array}{cc} 2 & 0 \\ 0 & 5 \end{array} \right)

The operation it performed on the basis vectors of the standard basis S was scaling, and scaling only. If a linear transformation matrix does nothing to a vector except scale it, that vector is an eigenvector of the matrix, and the scaling factor is the corresponding eigenvalue. This is just the definition of an eigenvector, which I restate below:

Ax = \lambda x
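As a quick sanity check on this definition (a sketch using NumPy, not part of the original post), `numpy.linalg.eig` applied to the diagonal matrix above recovers the standard basis vectors as eigenvectors, with eigenvalues 2 and 5:

```python
import numpy as np

# The diagonal transformation from the example above.
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)    # [2. 5.]
print(eigvecs)    # columns [1, 0] and [0, 1]: the standard basis vectors

# Verify A x = lambda x for each eigenpair.
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))   # True, True
```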


Matrix Theory: Basis change and Similarity transformations

Basis Change

Understand that there is nothing particularly special about the standard basis vectors [1, 0] and [0, 1]. Any 2D vector may be represented as a linear combination of these vectors. Thus, the vector [7, 24] may be written as:

\left( \begin{array}{c} 7 \\ 24 \end{array} \right) = 7 \left( \begin{array}{c} 1 \\ 0 \end{array} \right) + 24 \left( \begin{array}{c} 0 \\ 1 \end{array} \right)
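The same idea extends to any basis: put the basis vectors into the columns of a matrix and solve a linear system for the coordinates. A minimal sketch (the alternative basis below is my own illustrative choice, not from the post):

```python
import numpy as np

v = np.array([7.0, 24.0])

# In the standard basis, the coordinates are just the components.
E = np.eye(2)
print(np.linalg.solve(E, v))       # [ 7. 24.]

# An arbitrary alternative basis: its vectors are the columns of B.
B = np.array([[1.0,  1.0],
              [1.0, -1.0]])
coords = np.linalg.solve(B, v)     # coordinates of v relative to B
print(coords)                      # [15.5 -8.5]
print(np.allclose(B @ coords, v))  # True: the combination reconstructs v
```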
