I have two arrays with shapes v1=(3000,3) and v2=(3,2,3000). The 3000 is a time dimension, so v1 holds 3000 samples of shape (1,3) and v2 holds 3000 samples of shape (3,2). I want to do matrix multiplication broadcast along the 3000 dimension so that I get 3000 (1,2) vectors in return.
I have tried reshaping so that v1 = (1,3,3000) and v2 = (3,2,3000), which gives an error saying that the shapes are not aligned.
code:
import numpy as np
v1 = np.ones((1,3,3000)) + 1
v2 = np.ones((3,2,3000)) - 0.5
np.dot(v1,v2)  # raises ValueError: shapes not aligned
With v1 of shape (3000,3) and v2 of shape (3,2,3000), we can use np.einsum -

np.einsum('ij,jki->ik', v1, v2)

This gives us an output of shape (3000,2).
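For reference, here is a minimal sketch (with assumed random test data standing in for your arrays) that checks the einsum call against a per-sample matrix product done in a plain Python loop:

import numpy as np

# assumed test data: 3000 row vectors of shape (3,) and 3000 matrices of shape (3,2)
v1 = np.random.rand(3000, 3)
v2 = np.random.rand(3, 2, 3000)

out = np.einsum('ij,jki->ik', v1, v2)   # shape (3000, 2)

# loop-based reference: multiply each (1,3) sample by its own (3,2) matrix
ref = np.stack([v1[t] @ v2[:, :, t] for t in range(v1.shape[0])])

print(out.shape)              # (3000, 2)
print(np.allclose(out, ref))  # True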
We could play around with the optimize arg in np.einsum. With optimize=True it leverages BLAS internally, while with optimize=False it resorts to simple C loops. The BLAS route requires some setup work as well. So, when the axes that undergo sum-reduction are reasonably long, we might want to set that flag to True, and to False otherwise. In this case those axes are really short (the reduced axis has length 3), so we are probably better off with the default optimize=False.
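As a rough sketch of how to compare the two settings (assumed random test data again; actual timings depend on your machine and array sizes):

import numpy as np
from timeit import timeit

v1 = np.random.rand(3000, 3)
v2 = np.random.rand(3, 2, 3000)

# default: straightforward summation loops in C
t_false = timeit(lambda: np.einsum('ij,jki->ik', v1, v2, optimize=False), number=100)
# optimize=True: searches for a faster (possibly BLAS-backed) contraction, with some setup overhead
t_true = timeit(lambda: np.einsum('ij,jki->ik', v1, v2, optimize=True), number=100)

print(t_false, t_true)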