
Tensorflow clip_by_value versus Keras NonNeg


I want to constrain a TensorFlow Variable to be non-negative via the constraint keyword argument. Should I use clip_by_value from base TensorFlow or NonNeg from Keras? Here is my implementation of the clip_by_value constraint:

import numpy as np
import tensorflow as tf

def make_nonneg(x):
    return tf.clip_by_value(x, 0.0, np.inf)

Also, do these both work equally well if I end up using the variable in code wrapped in a @tf.function call?


Solution

  • It is a matter of taste and depends on how you use it. If you define a model using Keras layers and want to apply a constraint during training, it is very convenient to use Keras constraints:

    model.add(Dense(64, kernel_constraint=NonNeg()))
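
    Note that kernel_constraint accepts any callable, not just Constraint subclasses, so the clip-based function from the question should drop in the same way (a small sketch, assuming make_nonneg is defined as in the question):

      model.add(Dense(64, kernel_constraint=make_nonneg))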
    

    But what NonNeg effectively does is pretty similar to clip_by_value:

      def __call__(self, w):
        return w * math_ops.cast(math_ops.greater_equal(w, 0.), K.floatx())
    

    See https://github.com/tensorflow/tensorflow/blob/v2.4.1/tensorflow/python/keras/constraints.py#L93
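
    On a concrete tensor the two approaches agree (a quick eager-mode check, reusing make_nonneg from the question):

      import tensorflow as tf
      from tensorflow.keras.constraints import NonNeg

      w = tf.constant([-3.0, 0.0, 5.0])
      print(NonNeg()(w).numpy())     # [0. 0. 5.]
      print(make_nonneg(w).numpy())  # [0. 0. 5.]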

    So, the NonNeg constraint is just a callable object that multiplies the weights elementwise by a 0/1 mask, zeroing out any negative entries.
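
    As for the @tf.function part of the question: both constraints boil down to plain TensorFlow ops (a clip in one case, a compare-and-multiply in the other), so either traces without trouble. A minimal sketch to check this (the variable names and the explicit constraint call are my own illustration; note that a variable's constraint is stored, not enforced automatically — Keras optimizers apply it after each update):

      import numpy as np
      import tensorflow as tf
      from tensorflow.keras.constraints import NonNeg

      def make_nonneg(x):
          return tf.clip_by_value(x, 0.0, np.inf)

      v_clip = tf.Variable([-1.0, 2.0], constraint=make_nonneg)
      v_keras = tf.Variable([-1.0, 2.0], constraint=NonNeg())

      @tf.function
      def apply_constraints():
          # Both constraint callables trace cleanly inside a tf.function
          v_clip.assign(v_clip.constraint(v_clip))
          v_keras.assign(v_keras.constraint(v_keras))
          return v_clip.read_value(), v_keras.read_value()

      print(apply_constraints())  # both come back as [0., 2.]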