
Torch tensors swapping dimensions


I came across these two lines (back-to-back) of code in a torch project:

 im4[{1,{},{}}] = im3[{3,{},{}}]
 im4[{3,{},{}}] = im3[{1,{},{}}]

What do these two lines do? I assumed they did some sort of swapping.


Solution

  • This is covered in the indexing section of the Torch Tensor Documentation.

    Indexing using the empty table {} is shorthand for all indices in that dimension. Below is a demo which uses {} to copy an entire row from one matrix to another:

    > a = torch.Tensor(3, 3):fill(0)
    > a
         0 0 0
         0 0 0
         0 0 0
    
    > b = torch.Tensor(3, 3)
    > for i=1,3 do for j=1,3 do b[i][j] = (i - 1) * 3 + j end end
    > b
         1 2 3
         4 5 6
         7 8 9
    
    > a[{1, {}}] = b[{3, {}}]
    > a
         7 8 9
         0 0 0
         0 0 0
    

    This assignment is equivalent to: a[1] = b[3].
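
    A quick check in the same REPL session confirms the equivalence; the shorter form fills a's first row with b's third row just as before:

    > a = torch.Tensor(3, 3):fill(0)   -- reset a to zeros
    > a[1] = b[3]                      -- same effect as a[{1, {}}] = b[{3, {}}]
    > a
         7 8 9
         0 0 0
         0 0 0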

    Your example is similar:

     im4[{1,{},{}}] = im3[{3,{},{}}]
     im4[{3,{},{}}] = im3[{1,{},{}}]
    

    which is more clearly stated as:

     im4[1] = im3[3]
     im4[3] = im3[1]
    

    The first line assigns the values from im3's third slice along the first dimension (a 2D sub-matrix, for example a color channel if these are image tensors) to im4's first slice, and the second line assigns the first slice of im3 to the third slice of im4.
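
    To make this concrete, here is a minimal sketch with small 3D tensors. (That im3 and im4 are 3-slice, image-like tensors of size 3xHxW is an assumption based on the variable names, not something stated in the question.)

    > im3 = torch.Tensor(3, 2, 2)
    > for c = 1, 3 do im3[c]:fill(c) end     -- slice c holds the value c everywhere
    > im4 = torch.Tensor(3, 2, 2):zero()
    > im4[{1, {}, {}}] = im3[{3, {}, {}}]    -- copy im3's 3rd slice into im4's 1st
    > im4[{3, {}, {}}] = im3[{1, {}, {}}]    -- copy im3's 1st slice into im4's 3rd
    > im4[1]
         3 3
         3 3
    > im4[3]
         1 1
         1 1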

    Note that this is not a swap, as im3 is never written to and im4 is never read from.
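
    If you actually wanted to swap two slices within a single tensor, you would need an explicit copy, since indexing like t[1] only returns a view into t's storage. A minimal sketch:

    > t = torch.Tensor(3, 2, 2)
    > for c = 1, 3 do t[c]:fill(c) end
    > tmp = t[1]:clone()     -- clone() copies the data; tmp = t[1] alone would alias t
    > t[1] = t[3]
    > t[3] = tmp
    > t[1]
         3 3
         3 3
    > t[3]
         1 1
         1 1

    Alternatively, t:index(1, torch.LongTensor{3, 2, 1}) returns a new tensor with the first and third slices exchanged.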