
Is it normal to train with two training sets and/or two different loss functions?


I have two training sets: one labelled and one unlabelled.

During training, I load one batch from the labelled set and compute the first loss function on it, then load one batch from the unlabelled set and compute the second loss function on it. Finally I sum the two losses and call `loss.backward()`.

Does this approach work? It seems uncommon to me, so I want to ask whether the autograd engine knows how to back-propagate correctly in this case. Thank you.


Solution

  • I got an answer from a PyTorch forum discussion: the autograd engine in PyTorch handles two or more different loss functions correctly, since the summed loss is just another node in the computation graph, so there is no need to worry about the gradients being wrong. Thanks.
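
For reference, the pattern described in the question can be sketched as below. The model, the choice of unsupervised loss (entropy minimisation), and all data are illustrative assumptions, not from the original post; the point is only that summing two losses and calling `backward()` once back-propagates through both terms.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical tiny model and optimiser (illustrative, not from the post)
model = nn.Linear(10, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

sup_loss_fn = nn.CrossEntropyLoss()  # loss for the labelled batch

# One labelled batch and one unlabelled batch (random stand-in data)
x_lab = torch.randn(4, 10)
y_lab = torch.randint(0, 2, (4,))
x_unlab = torch.randn(4, 10)

opt.zero_grad()

# First loss: supervised, on the labelled batch
loss_sup = sup_loss_fn(model(x_lab), y_lab)

# Second loss: e.g. entropy minimisation on the unlabelled batch
# (an arbitrary example of an unsupervised loss)
probs = model(x_unlab).softmax(dim=1)
loss_unsup = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()

# Sum the two losses; one backward() pass computes gradients for both
loss = loss_sup + loss_unsup
loss.backward()
opt.step()
```

Because the sum is itself a node in the autograd graph, `loss.backward()` accumulates the gradients of both terms into the same parameters, which is exactly what the forum answer confirms.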