I want to diagonalize a matrix and then be able to do basis changes. The aim in the end is to do matrix exponentiation, with exp(A) = P.exp(D).P^{-1}.

I use sgeev to diagonalize A. If I am not mistaken (and I probably am, since it's not working), sgeev gives me P in the vr matrix and P^{-1} is transpose(vl). The diagonal matrix D can be reconstituted from the eigenvalues wr.

The problem is that when I try to verify the transformation by computing P * D * P^{-1}, I don't get A back.
Here's my code:
program diag
  implicit none
  integer :: i, n, info
  real, allocatable :: A(:,:), B(:,:), C(:,:), D(:,:)
  real, allocatable :: wr(:), wi(:), vl(:,:), vr(:,:), work(:)

  n = 3
  allocate(vr(n,n), vl(n,n), wr(n), wi(n), work(4*n))
  allocate(A(n,n), B(n,n), C(n,n), D(n,n))

  A(1,:) = (/1, 0, 1/)
  A(2,:) = (/0, 2, 1/)
  A(3,:) = (/0, 3, 1/)

  ! Eigenvalues in (wr, wi), left eigenvectors in vl, right eigenvectors in vr.
  ! Note that sgeev overwrites A on exit.
  call sgeev('V', 'V', n, A, n, wr, wi, vl, n, vr, n, work, size(work,1), info)

  print *, 'eigenvalues'
  do i = 1, n
    print *, i, wr(i), wi(i)
  enddo

  ! Diagonal matrix built from the eigenvalues
  D = 0.0
  D(1,1) = wr(1)
  D(2,2) = wr(2)
  D(3,3) = wr(3)

  ! Attempt to reconstruct A as vr * D * transpose(vl)
  C = matmul(D, transpose(vl))
  B = matmul(vr, C)

  print *, 'A'
  do i = 1, n
    print *, B(i,:)
  enddo
end program diag
The printed result is:
eigenvalues
1 1.00000000 0.00000000
2 3.30277562 0.00000000
3 -0.302775621 0.00000000
A
0.688247263 0.160159975 0.764021933
0.00000000 1.66581571 0.817408621
0.00000000 2.45222616 0.848407149
The printed matrix is not the original A, not even up to an overall factor.
I guess I am somehow mistaken, since I checked the eigenvectors by computing matmul(A,vr) against matmul(vr,D), and matmul(transpose(vl),A) against matmul(D,transpose(vl)), and both pairs matched.
Where am I wrong?
The problem is that transpose(vl) is not the inverse of vr. The normalisation used by sgeev is that each eigenvector (each column of vl or vr) is individually normalised to unit Euclidean norm. This means that dot_product(vl(:,i), vr(:,j)) is zero if i /= j, but is in general not equal to 1 if i == j.
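You can see this directly with a small check appended to the program above (a sketch that reuses the C array from the question as scratch space and assumes vl, vr, n and i are as declared there): transpose(vl) * vr comes out diagonal, but its diagonal entries are not 1.

  ! Sketch: transpose(vl) * vr is diagonal (biorthogonality) but not the identity
  C = matmul(transpose(vl), vr)
  print *, 'transpose(vl) * vr'
  do i = 1, n
    print *, C(i,:)
  enddo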
If you want to get P^{-1}, you need to scale each column of vl by a factor of 1/dot_product(vl(:,i), vr(:,i)) before transposing it, as in the sketch below.
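Here is a minimal sketch of that rescaling, meant to be spliced into the program above after D is built. It assumes the vl, vr, wr, D, B, C, n and i from the question; vlscaled is a new array introduced for this example and should be declared with the other allocatables.

  real, allocatable :: vlscaled(:,:)   ! goes in the declaration section

  allocate(vlscaled(n,n))
  ! Rescale each left eigenvector so that transpose(vlscaled) is the inverse of vr
  do i = 1, n
    vlscaled(:,i) = vl(:,i) / dot_product(vl(:,i), vr(:,i))
  enddo

  ! Now vr * D * transpose(vlscaled) reproduces the original A
  C = matmul(D, transpose(vlscaled))
  B = matmul(vr, C)

  ! For exp(A) = P * exp(D) * P^{-1}, put exp(wr(i)) on the diagonal instead
  ! (valid here because all eigenvalues are real: wi = 0)
  do i = 1, n
    D(i,i) = exp(wr(i))
  enddo
  B = matmul(vr, matmul(D, transpose(vlscaled)))   ! B is now exp(A)

Note that this simple real rescaling only applies when all eigenvalues are real, as in this example: for complex conjugate pairs (wi /= 0), sgeev packs the real and imaginary parts of the eigenvectors into consecutive columns of vl and vr, and you would have to reassemble the complex vectors first.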