I have a matrix called vec with two columns, vec[:,0] and vec[:,1]. P contains two matrices, P[0,:,:] and P[1,:,:]. I want to multiply P[0,:,:] with the first column of vec and P[1,:,:] with the second column of vec. However, P@vec also gives me the matrix product of P[0,:,:] with the second column of vec and of P[1,:,:] with the first column of vec, which slows my code.
Is it possible to directly compute only the matching pairs (column 1 with matrix 1, column 2 with matrix 2) without the "off" products?
import numpy as np

P = np.arange(50).reshape(2, 5, 5)
vec = np.arange(10).reshape(5, 2)

# have has shape (2, 5, 2): each matrix in P is multiplied with both columns of vec
have = P @ vec

# want keeps only the matching products: P[0] with vec[:,0], P[1] with vec[:,1]
want = np.column_stack((have[0, :, 0], have[1, :, 1]))
have, want
There is a very powerful function in numpy called np.einsum. It can perform all kinds of tensor contractions, axis reorderings, and matrix multiplications. For your example you could use

res = np.einsum('nij,jn->in', P, vec)

after which res is exactly like want.
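Reusing P, vec, and want from the question's snippet, a quick sanity check that the two results agree:

res = np.einsum('nij,jn->in', P, vec)
np.array_equal(res, want)   # True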
How does this work: you give the np.einsum function both of your arrays as well as a signature (that 'nij,jn->in' string) that tells the function how to multiply the arrays. In short, you want the third axis of the P tensor to be contracted with the first axis of vec. Therefore you give both the same index j in the signature string and leave it out of the part after the ->. An index that appears on both the left and right hand side of the -> is not summed over but kept, which is the case here for the n and i indices: n pairs each matrix P[n,:,:] with the matching column vec[:,n], and i labels the rows of the result.
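To spell the signature out, here is a minimal plain-loop sketch of what 'nij,jn->in' computes, reusing P and vec from above (illustration only; the names n_mat, rows, and res_loop exist just for this sketch, and np.einsum does all of this without Python-level loops):

n_mat, rows, _ = P.shape          # (2, 5, 5)
res_loop = np.zeros((rows, n_mat))
for n in range(n_mat):            # n is on both sides of ->: kept, pairs P[n] with vec[:, n]
    for i in range(rows):         # i is on both sides of ->: kept, labels the output rows
        for j in range(rows):     # j is only on the left: contracted, i.e. summed over
            res_loop[i, n] += P[n, i, j] * vec[j, n]
# res_loop matches np.einsum('nij,jn->in', P, vec)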
A more complete explanation of this function, with many examples of how to use it, can be found in the numpy documentation: https://numpy.org/doc/stable/reference/generated/numpy.einsum.html
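Since the motivation is speed: whether einsum actually beats computing the full P@vec and discarding the "off" products depends on the array sizes and on your BLAS, so it is worth timing both on data shaped like yours. A rough sketch (the sizes here are made up for illustration):

import timeit
import numpy as np

P_big = np.random.rand(2, 500, 500)
vec_big = np.random.rand(500, 2)

def via_matmul():
    have = P_big @ vec_big                             # computes all four products
    return np.column_stack((have[0, :, 0], have[1, :, 1]))

def via_einsum():
    return np.einsum('nij,jn->in', P_big, vec_big)     # computes only the two wanted ones

print(timeit.timeit(via_matmul, number=100))
print(timeit.timeit(via_einsum, number=100))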