Multiplying a matrix by one of its eigenvectors should give the same result as multiplying that eigenvector by its corresponding eigenvalue. I am trying to verify that my eigenvectors and eigenvalues behave as advertised, but the outputs don't look right.
import numpy as np

cov_matrix = np.cov(scaled_data)                 # scaled_data defined elsewhere
eig_vals, eig_vecs = np.linalg.eigh(cov_matrix)
a = cov_matrix.dot(eig_vecs[:, 0])               # A @ v
b = eig_vecs[:, 0] * eig_vals[0]                 # lambda * v
When I print a and b, they are the same shape but their values are all different. What is going wrong here?
Try the following:
import numpy as np
np.random.seed(42) # for reproducibility
A = np.random.random((10, 10)) + np.random.random((10, 10)) * 1j  # random complex matrix, NOT Hermitian
eig_vals, eig_vecs = np.linalg.eigh(A)  # eigh silently assumes A is Hermitian
print(np.allclose(A @ eig_vecs[:, 0], eig_vals[0] * eig_vecs[:, 0]))
# False
Keep in mind that np.linalg.eigh returns the eigenvalues and eigenvectors of a complex Hermitian (conjugate symmetric) or real symmetric matrix, and it never checks that its input actually is one. So for a Hermitian matrix:
A = (A + A.T.conj()) / 2  # force A to be Hermitian
eig_vals, eig_vecs = np.linalg.eigh(A)
print(np.allclose(A @ eig_vecs[:, 0], eig_vals[0] * eig_vecs[:, 0]))
# True
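Why does eigh run without complaint on a non-Hermitian input? By default it only reads one triangle of the array (UPLO='L', i.e. the lower one) and silently diagonalizes the Hermitian matrix built from that triangle instead. A minimal sketch of that behaviour (B and B_seen are just illustrative names):

import numpy as np

np.random.seed(0)
B = np.random.random((4, 4))  # deliberately not symmetric

# The matrix eigh actually decomposes: B's lower triangle mirrored upward.
B_seen = np.tril(B) + np.tril(B, -1).T
print(np.allclose(np.linalg.eigh(B)[0], np.linalg.eigh(B_seen)[0]))
# True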
Before diagonalizing, check whether cov_matrix is Hermitian with something like np.allclose(cov_matrix, cov_matrix.T.conj()). If it is not, just use np.linalg.eig instead.
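If you want that check baked in, something along these lines would work (safe_eig is just an illustrative name, not a NumPy function):

import numpy as np

def safe_eig(M):
    # Dispatch to eigh for Hermitian input, otherwise fall back to eig.
    if np.allclose(M, M.conj().T):
        return np.linalg.eigh(M)
    return np.linalg.eig(M)  # note: eig may return complex eigenvalues

M = np.cov(np.random.random((3, 50)))  # covariance matrices are symmetric
w, v = safe_eig(M)
print(np.allclose(M @ v[:, 0], w[0] * v[:, 0]))
# True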