I have a matrix X with shape (F,T,M). I wish to multiply each (T,M) matrix along the F axis so that the answer will be of shape (M,M,F). This code does the job for me, but this operation repeats many times and it is very slow:
# for each index f along the first axis, form the (M,M) product X[f].T @ X[f]
for f in range(F):
    output[:, :, f] = np.matmul(X[f, :, :].T, X[f, :, :])
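
The snippet above assumes X and output already exist; a minimal self-contained setup would look like this (the sizes and random data are just placeholder assumptions):

import numpy as np

F, T, M = 8, 50, 6            # placeholder sizes, chosen arbitrarily
X = np.random.rand(F, T, M)   # stand-in for the real data
output = np.empty((M, M, F))  # filled in by the loop above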
All I could find is the np.tensordot() function. If I understand correctly, this is not a good option for me, since I need a matrix multiplication and not a dot product.
How do I implement this efficiently using numpy? Is it possible and beneficial to utilize keras/tf for this purpose?
We can use np.matmul / the @ operator in Python 3.x after swapping the last two axes -
# batched matmul over the first axis: (F,M,T) @ (F,T,M) -> (F,M,M), then move F to the back
np.matmul(X.swapaxes(1,2), X).swapaxes(0,2)
(X.swapaxes(1,2) @ X).swapaxes(0,2)
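
As a sanity check, the batched expression can be compared against the original loop on small random data (the sizes below are arbitrary assumptions):

import numpy as np

F, T, M = 8, 50, 6
X = np.random.rand(F, T, M)

# reference result from the explicit loop
expected = np.empty((M, M, F))
for f in range(F):
    expected[:, :, f] = X[f].T @ X[f]

# batched version: (F,M,T) @ (F,T,M) -> (F,M,M) -> (M,M,F)
result = (X.swapaxes(1, 2) @ X).swapaxes(0, 2)

assert np.allclose(result, expected)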
Alternatively, with np.einsum, with a direct translation of the shape variables into the string notation -
np.einsum('ftm,ftn->mnf',X,X)
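
The einsum form can be verified the same way; np.einsum also accepts an optimize flag, which may help for larger inputs (again, the sizes here are arbitrary assumptions):

import numpy as np

F, T, M = 8, 50, 6
X = np.random.rand(F, T, M)

out_matmul = (X.swapaxes(1, 2) @ X).swapaxes(0, 2)
out_einsum = np.einsum('ftm,ftn->mnf', X, X, optimize=True)

assert np.allclose(out_matmul, out_einsum)  # both give the same (M,M,F) result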