Tags: python, tensorflow, loss

Does the loss in tensorflow.train.AdamOptimizer have to be positive?


My question is a simple one: I want to minimize loss = a - b, where the loss is ideally as large a negative number as possible, i.e., b much bigger than a.

Since the loss is positive in all the examples I have seen, I wanted to ask whether I can pass my loss into compute_gradients as-is and get the desired result. Cheers


Solution

  • Yes, you can. As long as it is a minimization rather than a maximization, everything works exactly as in the examples: the optimizer follows the gradient downhill regardless of the loss's sign. See the sketch below.
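
To illustrate, here is a minimal sketch against the TF 1.x API the question uses; the scalar variables a and b are hypothetical stand-ins for the asker's actual tensors. The loss starts positive, crosses zero, and keeps decreasing, and AdamOptimizer handles it the same throughout:

    import tensorflow as tf  # TF 1.x API, as in the question

    # Hypothetical stand-ins for the question's a and b.
    a = tf.Variable(5.0)
    b = tf.Variable(1.0)

    # This loss can be (and here will become) negative; the optimizer
    # simply follows the gradient downhill regardless of sign.
    loss = a - b

    optimizer = tf.train.AdamOptimizer(learning_rate=0.1)
    grads_and_vars = optimizer.compute_gradients(loss)
    train_op = optimizer.apply_gradients(grads_and_vars)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(100):
            _, current_loss = sess.run([train_op, loss])
        print(current_loss)  # well below zero: a has shrunk, b has grown

Note that with two free scalars this particular loss is unbounded below, so it will decrease indefinitely; in a real model the structure of the tensors producing a and b is what keeps the minimization meaningful.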