
PyTorch tensor and its transpose have different storage


I was reading the book Deep Learning with PyTorch and was trying out an example which shows that a tensor and its transpose share the same storage.

However, when I tried it out on my local machine, I can see that the storage is different for both. I just wanted to understand why this might be the case.

The code I tried and its output are below:

>>> points = torch.tensor([[4.0, 1.0], [5.0, 3.0], [2.0, 1.0]])
>>> points_t = torch.transpose(points,0,1)
>>> points_t
tensor([[4., 5., 2.],
        [1., 3., 1.]])
>>> id(points.storage())==id(points_t.storage())
False
>>> id(points.storage())
2796700202176
>>> id(points_t.storage())
2796700201888

My Python version is 3.9.7 and my PyTorch version is 1.11.0.


Solution

  • You need to compare the data pointers of the storages instead of taking the `id` of the storage objects.

    >>> points = torch.tensor([[4.0, 1.0], [5.0, 3.0], [2.0, 1.0]])
    >>> points_t = torch.transpose(points,0,1)
    >>> points_t
    tensor([[4., 5., 2.],
            [1., 3., 1.]])
    >>> points.storage().data_ptr() == points_t.storage().data_ptr()
    True
    

    The reason you are getting False for the `id` comparison is that each call to `.storage()` returns a new Python wrapper object. The wrappers for `points` and `points_t` are therefore distinct objects with different ids, even though the underlying storage (the memory allocated to hold the data) is the same. `data_ptr()` reports the address of that shared memory buffer, which is why it compares equal.
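
    A small sketch to make the distinction concrete (the variable names `s1`/`s2` and the element chosen for mutation are just for illustration): two calls to `.storage()` on the same tensor yield distinct wrapper objects, yet both point at the same buffer, and writing through the transpose is visible in the original tensor.

    ```python
    import torch

    points = torch.tensor([[4.0, 1.0], [5.0, 3.0], [2.0, 1.0]])
    points_t = torch.transpose(points, 0, 1)

    # Each .storage() call builds a fresh Python wrapper object,
    # so even two calls on the *same* tensor are not identical.
    s1 = points.storage()
    s2 = points.storage()
    print(s1 is s2)                        # False: distinct wrappers
    print(s1.data_ptr() == s2.data_ptr())  # True: same memory buffer

    # Because the transpose is a view over the same buffer,
    # mutating it is visible in the original tensor.
    points_t[0, 1] = 42.0
    print(points[1, 0])  # tensor(42.)
    ```

    This is why `data_ptr()` is the right check for shared storage: it compares the memory address of the data itself, not the identity of the short-lived wrapper objects.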