Before going deep into the theory, if what you are looking for are practical examples, you can find them in these two links:
Example 1
Example 2
Given the endomorphism
F : V → V
x∈V → F(x) = Ax∈V
with V an n-dimensional vector space, n > 1, over K (where K contains all the roots of the characteristic polynomial).
We have seen that for every endomorphism there exists an n×n matrix A such that
F(v) = Av
Two matrices, A and B, over K are said to be similar if there exists an invertible matrix P, also over the field K, such that
B = PAP⁻¹
Then a matrix A is diagonalizable if it is similar to a diagonal matrix.
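As a quick numerical sketch of this definition (the matrix is my own illustrative example, not one from the text), we can check with numpy that a matrix is similar to the diagonal matrix of its eigenvalues, with the eigenvectors as the columns of P:

```python
import numpy as np

# Sketch (matrix is a hypothetical example): A is similar to the diagonal
# matrix D of its eigenvalues via the matrix P of eigenvectors, D = P^{-1} A P.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigenvalues, P = np.linalg.eig(A)            # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P                 # similarity transform
print(np.allclose(D, np.diag(eigenvalues)))  # A is similar to a diagonal matrix
```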
The characteristic polynomial of the matrix A decomposes into k roots λj, each one with multiplicity mj. At the same time, for each λj there exists an eigenspace E(λj) such that
Dim(E(λj)) = dj ≤ mj
and therefore
d1 + d2 + ... + dk ≤ m1 + m2 + ... + mk = n
The number mj is sometimes called the algebraic multiplicity of the eigenvalue λj, and dj its geometric multiplicity.
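The two multiplicities can be read off directly with sympy; this is a sketch with a hypothetical matrix of my own, chosen so that the two multiplicities differ:

```python
import sympy as sp

# Sketch comparing algebraic multiplicity m_j with geometric multiplicity d_j
# (the matrix is my own example, not taken from the text).
A = sp.Matrix([[2, 1],
               [0, 2]])
for lam, m_j, vectors in A.eigenvects():
    d_j = len(vectors)        # geometric multiplicity = Dim(E(lambda_j))
    print(lam, m_j, d_j)      # 2 2 1 : here d_j < m_j
```

Since d_j < m_j for the eigenvalue 2, this matrix is an example of the non-diagonalizable case discussed below.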
If each eigenvalue λj generates an eigenspace E(λj) with dimension equal to the (algebraic) multiplicity of the eigenvalue, that is,
mj = dj
then A is a diagonalizable matrix (and F is also said to be diagonalizable).
Otherwise, the sum of the dimensions of the eigenspaces generated by the eigenvalues is less than n, i.e., a basis of eigenvectors does not fill all of the space V. In this case the matrix is not diagonalizable, but it is possible to find a basis in which the matrix of F is expressed in a form called the
Jordan Canonical Form
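sympy can compute this form directly. Here is a minimal sketch with a matrix of my own whose only eigenvalue, 3, has algebraic multiplicity 2 but a 1-dimensional eigenspace, so the matrix is not diagonalizable and is instead similar to a Jordan block:

```python
import sympy as sp

# Sketch (matrix is my own example): A has eigenvalue 3 with m = 2 but d = 1,
# so it is not diagonalizable; its Jordan canonical form is a single block.
A = sp.Matrix([[5, 4],
               [-1, 1]])
P, J = A.jordan_form()   # A = P J P^{-1}
print(J)                 # Matrix([[3, 1], [0, 3]])
```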
Let V be a vector space of dimension n over K, with
F : V → V
x∈V → F(x) = Ax∈V
and let A be the n×n matrix associated with this endomorphism.
Let
P(λ) = (λ − λ1)^m1 · (λ − λ2)^m2 ··· (λ − λk)^mk
be the characteristic polynomial of the matrix A, with λj ∈ K eigenvalues of A ∀ j = 1, 2, ..., k.
Each eigenvalue corresponds to an eigenspace, a subspace of V given by
E(λj) = {v ∈ V such that F(v) = λj v} = {v ∈ V such that (F − λj Id)(v) = 0} = Ker(F − λj Id).
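This kernel description gives a direct way to compute an eigenspace: it is the null space of (A − λj·Id). A sketch with a hypothetical matrix of my own:

```python
import sympy as sp

# Sketch (matrix is my own example): the eigenspace E(lambda_j) is the
# kernel of (A - lambda_j * Id), computed here as a null space.
A = sp.Matrix([[4, 1],
               [2, 3]])
lam = 5                                 # an eigenvalue of A
E = (A - lam * sp.eye(2)).nullspace()   # basis of E(5)
print(E)
```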
Each eigenspace is generated by a basis of vectors associated with the eigenvalue λj. We call this basis of the eigenspace
Bj = {v1, v2, ..., vdj}
In general the dimension of each eigenspace is less than or equal to the multiplicity of the corresponding eigenvalue, i.e. Dim(E(λj)) ≤ mj.
However, if A is diagonalizable, the dimension of each eigenspace is equal to the multiplicity of the corresponding eigenvalue, as we see in the following theorem.
We have the following theorem as a criterion for matrix diagonalization.
Theorem 1 (necessary and sufficient condition for A to be diagonalizable)
A is diagonalizable if and only if the following two conditions hold:
1) m1 + m2 + ... + mk = n.
2) For each eigenvalue λj with multiplicity mj, the corresponding eigenspace has dimension mj, i.e.
Dim(E(λj)) = mj
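The criterion of condition 2) can be checked mechanically with sympy; in this sketch (both matrices are hypothetical examples of my own) the result agrees with sympy's built-in `is_diagonalizable`:

```python
import sympy as sp

# Sketch of the Theorem 1 criterion: A is diagonalizable exactly when every
# eigenvalue satisfies Dim(E(lambda_j)) = m_j (condition 1 holds automatically
# here because sympy finds all roots).
def satisfies_theorem_1(A):
    return all(m_j == len(vectors)         # algebraic == geometric
               for _, m_j, vectors in A.eigenvects())

diagonalizable = sp.Matrix([[4, 1], [2, 3]])
defective      = sp.Matrix([[2, 1], [0, 2]])
print(satisfies_theorem_1(diagonalizable), diagonalizable.is_diagonalizable())
print(satisfies_theorem_1(defective), defective.is_diagonalizable())
```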
And we have the following corollaries
Corollary 1
If the matrix A is diagonalizable then V is the direct sum of the eigenspaces of A, i.e.
V = E(λ1) ⊕ E(λ2) ⊕ ... ⊕ E(λk)
Corollary 2
A matrix A is diagonalizable if and only if there is a basis of V consisting of eigenvectors of A.
Therefore we can give these sufficient conditions for diagonalization.
Theorem 2 (sufficient conditions for A to be diagonalizable)
1) If the characteristic polynomial has n distinct roots in the field K, then the matrix A is diagonalizable.
2) If the characteristic polynomial has k roots, and the eigenspace corresponding to each one has dimension equal to its multiplicity, then the matrix A is diagonalizable.
3) If neither 1) nor 2) holds, then A is not diagonalizable.
Thus, if we are in case 3) of the previous theorem, the matrix A is not diagonalizable.
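Condition 1) can be illustrated numerically. In this sketch (the matrix is my own example) the characteristic polynomial has 3 distinct roots, so by Theorem 2 the matrix must be diagonalizable:

```python
import numpy as np

# Sketch of sufficient condition 1): the characteristic polynomial of this
# hypothetical A is (1-t)(2-t)(3-t), with n = 3 distinct roots, so A is
# diagonalizable.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 0.0],
              [2.0, -4.0, 2.0]])
eigenvalues, P = np.linalg.eig(A)
print(len(set(np.round(eigenvalues, 8))))      # 3 distinct eigenvalues
D = np.linalg.inv(P) @ A @ P
print(np.allclose(D, np.diag(eigenvalues)))    # A is indeed diagonalizable
```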
In the case of symmetric matrices the situation is simpler, since all their eigenvalues are real and eigenvectors corresponding to distinct eigenvalues are orthogonal. Recall that a matrix is symmetric if it equals its transpose, i.e.
A is symmetric <=> A = Aᵗ
Theorem 3 (diagonalization of symmetric matrices)
If A is a symmetric matrix, then we have:
1) All its eigenvalues are real
2) A is diagonalizable.
3) The eigenvectors corresponding to distinct eigenvalues are orthogonal.
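All three claims of the theorem can be observed at once with numpy's `eigh`, which is designed for symmetric matrices. A sketch with a hypothetical symmetric matrix of my own:

```python
import numpy as np

# Sketch of Theorem 3 (matrix is my own example): a symmetric matrix has
# real eigenvalues and an orthonormal matrix Q of eigenvectors, so that
# Q^T A Q is diagonal.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)                     # A is symmetric
eigenvalues, Q = np.linalg.eigh(A)             # real eigenvalues, orthonormal Q
print(np.allclose(Q.T @ Q, np.eye(3)))         # eigenvectors are orthonormal
print(np.allclose(Q.T @ A @ Q, np.diag(eigenvalues)))  # A is diagonalizable
```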