Tags: neural-network, pytorch, loss-function

Is it possible to train a neural network using a loss function on unseen data (data different from the input data)?


Normally, a loss function is defined as L(y_hat, y) or L(f(X), y), where f is the neural network, X is the input data, and y is the target. Is it possible to implement (preferably in PyTorch) a loss function that depends not only on the input data X, but also on other data X' (X' != X)?

For example, let's say I have a neural network f, input data (X, y), and additional data X'. Can I construct a loss function such that

  1. f(X) is as close as possible to y, and also
  2. f(X') > f(X)?

The first part is easy to implement (in PyTorch: nn.MSELoss()), but the second part seems much harder.
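
For reference, the first condition on its own is just the standard regression loss (a minimal sketch, assuming f, X, and y are already defined):

    loss1 = nn.MSELoss()(f(X), y)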

P.S.: this question is a reformulation of Multiple regression while avoiding line intersections using neural nets, which was closed. The original question includes the input data and photos with a theoretical example.


Solution

  • Yes, it is possible. For instance, you can add a hinge-style penalty term using ReLU as follows:

    loss = nn.MSELoss()(f(X), y) + lambd * nn.ReLU()(f(X) - f(X_prime)).mean()
    

    where lambd is a hyperparameter and X_prime stands for X' (the prime is not a valid character in a Python identifier). The .mean() reduces the elementwise ReLU output to a scalar so that it can be added to the MSE term and backpropagated. Note that this corresponds to f(X') >= f(X); it is easily modified to the strict f(X') > f(X) by adding a small positive margin constant eps inside the ReLU, i.e. nn.ReLU()(f(X) - f(X_prime) + eps), which is zero only when f(X') >= f(X) + eps.
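
    Putting it together, here is a minimal self-contained sketch of this approach. The network architecture, data shapes, and the values of lambd and eps are placeholder assumptions for illustration only:

        import torch
        import torch.nn as nn

        # Placeholder network and data -- assumptions for illustration only.
        f = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))
        X = torch.randn(32, 3)        # input data
        y = torch.randn(32, 1)        # targets
        X_prime = torch.randn(32, 3)  # the "other" inputs X'

        mse = nn.MSELoss()
        lambd = 0.1  # weight of the ordering penalty (hyperparameter)
        eps = 1e-3   # small positive margin enforcing the strict f(X') > f(X)

        optimizer = torch.optim.Adam(f.parameters(), lr=1e-3)
        for step in range(1000):
            optimizer.zero_grad()
            out, out_prime = f(X), f(X_prime)
            # First term pulls f(X) toward y; second term is a hinge that is
            # positive whenever f(X') < f(X) + eps. The .mean() reduces the
            # elementwise penalty to a scalar so backward() works.
            loss = mse(out, y) + lambd * torch.relu(out - out_prime + eps).mean()
            loss.backward()
            optimizer.step()

    The hinge term vanishes exactly when f(X') >= f(X) + eps, so with eps > 0 the optimum satisfies the strict inequality; setting eps = 0 recovers the non-strict version above.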