Tags: python, pytorch

PyTorch: how to stack tensors generated in a for loop


I want to concatenate tensors generated in a for loop and get a 2D tensor. In standard Python, I would do something like this:

li = []
for i in range(len(items)):
    # compute calc_result for items[i]
    li.append(calc_result)

In my case, each loop iteration generates a tensor of shape torch.Size([768]), and I want to end up with a tensor of shape torch.Size([len(items), 768]).
How can I do this?


Solution

  • You can use torch.stack:

    torch.stack(li, dim=0)
    

Calling this after the for loop will give you a torch.Tensor of shape torch.Size([len(items), 768]).

    Note that if you know in advance the size of the final tensor, you can allocate an empty tensor beforehand and fill it in the for loop:

    x = torch.empty(size=(len(items), 768))
    for i in range(len(items)):
        x[i] = calc_result
    

    This is usually faster than stacking, because it avoids building a Python list and then copying everything into a new tensor at the end. A complete runnable sketch of both approaches follows below.
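
    For completeness, here is a minimal, self-contained sketch of both approaches. It assumes the per-item computation is some function (called calc_something here purely for illustration) that returns a tensor of shape (768,):

    import torch

    # Hypothetical stand-in for the per-item computation in the question:
    # it simply returns a tensor of shape (768,) for each item.
    def calc_something(item):
        return torch.randn(768)

    items = list(range(10))

    # Approach 1: collect the per-item tensors in a list, then stack them.
    li = [calc_something(item) for item in items]
    stacked = torch.stack(li, dim=0)
    print(stacked.shape)  # torch.Size([10, 768])

    # Approach 2: preallocate the output and fill it row by row.
    x = torch.empty(len(items), 768)
    for i, item in enumerate(items):
        x[i] = calc_something(item)
    print(x.shape)  # torch.Size([10, 768])

    Both produce a tensor of the desired shape; the second avoids the intermediate list entirely.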