For example, I have a list of N B x H tensors (i.e. an N x B x H tensor) and a list of N B-dimensional vectors (i.e. an N x B tensor). I want to multiply each B x H tensor in the list with its corresponding B-dimensional vector, resulting in an N x H tensor.
I know how to implement this computation with a single for-loop in PyTorch, but is there a vectorised implementation (i.e. no for-loop, just PyTorch/NumPy operations)?
You could achieve this with torch.bmm() and some torch.squeeze()/torch.unsqueeze().
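A bmm-based sketch could look like the following (the tensor names and sizes here are illustrative, not from the question):

```python
import torch

# Illustrative sizes: N=2 batch items, B=3, H=4
N, B, H = 2, 3, 4
A = torch.randn(N, B, H)  # list of N  B x H  tensors
v = torch.randn(N, B)     # list of N  B-dimensional vectors

# Treat each B-vector as a 1 x B row matrix, batch-multiply it with the
# corresponding B x H matrix, then drop the singleton dimension:
# (N, 1, B) @ (N, B, H) -> (N, 1, H) -> (N, H)
out = torch.bmm(v.unsqueeze(1), A).squeeze(1)
print(out.shape)  # torch.Size([2, 4])
```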
I am personally rather fond of the more generic torch.einsum() (which I find more readable):
import torch
import numpy as np
A = torch.from_numpy(np.array([[[1, 10, 100], [2, 20, 200], [3, 30, 300]],
[[4, 40, 400], [5, 50, 500], [6, 60, 600]]]))
B = torch.from_numpy(np.array([[ 1, 2, 3],
[-1, -2, -3]]))
AB = torch.einsum("nbh,nb->nh", A, B)
print(AB)
# tensor([[ 14, 140, 1400],
# [ -32, -320, -3200]])