I would like to do something similar to np.clip on a 2D PyTorch tensor. More specifically, I would like to clip each column to a column-specific range of values. For example, in NumPy you could do:
x = np.array([-1,10,3])
low = np.array([0,0,1])
high = np.array([2,5,4])
clipped_x = np.clip(x, low, high)
(clipped_x == np.array([0,5,3])).all() # True
I found torch.clamp, but unfortunately it does not support multidimensional bounds (only one scalar value for the entire tensor). Is there a "neat" way to extend that function to my case?
Thanks!
Not as neat as np.clip, but you can use torch.max and torch.min:
In [1]: x
Out[1]:
tensor([[0.9752, 0.5587, 0.0972],
[0.9534, 0.2731, 0.6953]])
Setting the lower and upper bound per column:
l = torch.tensor([[0.2, 0.3, 0.]])
u = torch.tensor([[0.8, 1., 0.65]])
Note that the lower bound l and the upper bound u are 1-by-3 tensors (2D with a singleton dimension). We need these shapes so that l and u are broadcastable to the shape of x.
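As a quick sanity check of the broadcasting claim above, this sketch (using the same example values) confirms that the (1, 3) bounds broadcast against the (2, 3) tensor; torch.broadcast_shapes is assumed to be available (it exists in reasonably recent PyTorch versions):

```python
import torch

# Same example values as above.
x = torch.tensor([[0.9752, 0.5587, 0.0972],
                  [0.9534, 0.2731, 0.6953]])  # shape (2, 3)
l = torch.tensor([[0.2, 0.3, 0.0]])           # shape (1, 3)
u = torch.tensor([[0.8, 1.0, 0.65]])          # shape (1, 3)

# The singleton dimension lets the per-column bounds broadcast across rows.
print(torch.broadcast_shapes(x.shape, l.shape, u.shape))  # torch.Size([2, 3])
```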
Now we can clip using torch.min and torch.max:
clipped_x = torch.max(torch.min(x, u), l)
Resulting in:
tensor([[0.8000, 0.5587, 0.0972],
[0.8000, 0.3000, 0.6500]])
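Putting it all together, here is a self-contained sketch of this min/max approach with the example values from above (the clamp-with-tensor-bounds variant in the comment assumes a newer PyTorch version, roughly 1.9+, where torch.clamp accepts Tensor min/max):

```python
import torch

x = torch.tensor([[0.9752, 0.5587, 0.0972],
                  [0.9534, 0.2731, 0.6953]])
l = torch.tensor([[0.2, 0.3, 0.0]])   # per-column lower bounds, shape (1, 3)
u = torch.tensor([[0.8, 1.0, 0.65]])  # per-column upper bounds, shape (1, 3)

# min() caps values at the upper bound, then max() raises them to the lower bound.
clipped = torch.max(torch.min(x, u), l)
print(clipped)
# tensor([[0.8000, 0.5587, 0.0972],
#         [0.8000, 0.3000, 0.6500]])

# On newer PyTorch versions, torch.clamp also accepts tensor bounds directly:
# clipped = torch.clamp(x, min=l, max=u)
```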