Tags: python, pytorch, tensor

PyTorch loses precision when converting numbers into tensors


Why does PyTorch make such a strange error in the later digits of this division?

import torch

a = torch.tensor(1.0/10.0)

print("{:.10f}, {:.10f}, {:.10f}".format(1.0/10.0, torch.tensor(1.0/10.0), a))

Output:

0.1000000000, **0.1000000015**, **0.1000000015**

My Python version is 3.12.4 and Torch is 2.3.0.post100.


Solution

  • This is a floating-point precision issue, not a bug: torch.tensor defaults to 32-bit floats (torch.float32), while a Python float is a 64-bit double, and 0.1 is not exactly representable in binary in either case. Creating the tensor with 64-bit floats (dtype=torch.float64) matches Python, as shown below.

    import torch

    a = 1.0 / 10.0                                    # Python float: 64-bit double
    b = torch.tensor(1.0/10.0, dtype=torch.float32)   # PyTorch's default dtype
    c = torch.tensor(1.0/10.0, dtype=torch.float64)   # double-precision tensor

    print(f"{a:.10f}, {b.item():.10f}, {c.item():.10f}")
    > 0.1000000000, 0.1000000015, 0.1000000000

    print(a == b.item(), a == c.item())
    > False True
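
  • To confirm that 0.1000000015 is simply the nearest value a 32-bit float can hold, you can round-trip 0.1 through a C float with the standard-library struct module. And if you prefer double precision globally rather than per tensor, torch.set_default_dtype changes the dtype used for all floating-point tensors created afterwards. A minimal sketch:

    import struct
    import torch

    # 0.1 has no exact binary representation; the value the tensor prints is
    # the nearest 32-bit float, which the standard library reproduces as well.
    f32 = struct.unpack("f", struct.pack("f", 0.1))[0]
    print(f"{f32:.10f}")
    > 0.1000000015

    # Make double precision the default for new floating-point tensors.
    torch.set_default_dtype(torch.float64)
    print(f"{torch.tensor(1.0/10.0).item():.10f}")
    > 0.1000000000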