I'm trying to insert tensors of different dimensions into a Lua table, but the insertion ends up writing the last tensor over all the previous elements of the table.
MWE:
require 'nn';
-- lookup table: 100 indices, 10-dimensional embeddings
char = nn.LookupTable(100,10,0,1)
charRep = nn.Sequential():add(char):add(nn.Squeeze())
c = {}
-- three index tensors of different lengths
c[1] = torch.IntTensor(5):random(1,100)
c[2] = torch.IntTensor(2):random(1,100)
c[3] = torch.IntTensor(3):random(1,100)
-- This works fine: c holds three distinct tensors
print(c)
charFeatures = {}
for i=1,3 do
  charFeatures[i] = charRep:forward(c[i])
  -- table.insert(charFeatures, charRep:forward(c[i]))
  -- no difference when table.insert is used
end
-- This fails: every element holds the same tensor
print(charFeatures)
Maybe I haven't understood how tables work in Lua, but this code copies the last tensor into all previous elements of charFeatures.
The issue is not related to tables; it is a very common pitfall in Torch. When you call the forward method on a neural net, its state variable output is overwritten in place. So when you save the return value into charFeatures[i], you do not copy the tensor: you create a reference from charFeatures[i] to charRep.output. In the next iteration of the loop charRep.output is modified again, and consequently all the elements of charFeatures are modified too, since they all point to the same tensor, charRep.output.
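You can verify the aliasing directly. A minimal sketch, assuming the standard Torch7 utility torch.pointer (which returns the address of an object) is available:

-- run after the loop from the question
print(torch.pointer(charFeatures[1]) == torch.pointer(charFeatures[2]))
-- true: both entries are the very same object
print(torch.pointer(charFeatures[1]) == torch.pointer(charRep.output))
-- true as well: they are all references to charRep.output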
Note that this behavior is the same as when you do
a = torch.Tensor(5):zero()
b = a        -- b is a reference to a, not a copy
a[1] = 1
-- now b[1] is 1 too: b was modified through a
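For contrast, a minimal sketch showing that clone breaks the reference (plain torch, no extra assumptions):

a = torch.Tensor(5):zero()
b = a:clone()  -- b is an independent copy, not a reference
a[1] = 1
-- b[1] is still 0: modifying a no longer affects b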
Finally, to solve your problem, you should clone the output of the network:
charFeatures[i] = charRep:forward(c[i]):clone()
And all will work as expected!
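For completeness, the corrected loop from your MWE then reads:

charFeatures = {}
for i=1,3 do
  -- clone() allocates a fresh tensor, so the next forward() call
  -- cannot overwrite what was stored in the table
  charFeatures[i] = charRep:forward(c[i]):clone()
end
print(charFeatures)  -- three distinct tensors, as expected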