Tags: python, deep-learning, cntk

CNTK: Define a custom loss function (Sørensen-Dice coefficient)


I'd like to use the Sørensen–Dice coefficient as a loss function in CNTK/Python. How can I define a custom loss function?


Solution

  • To answer your more general question, "How can I define a custom loss function?":

    In CNTK, loss functions are not special. Any expression that evaluates to a scalar can be used as a loss function. The learner computes the minibatch-level loss by summing the per-sample scalar loss values in the minibatch, and backpropagates through it as through any other CNTK expression.

    For example, the following is a way of defining a square-error loss:

    from cntk.ops import times_transpose

    def my_square_error(x, y):
        diff = x - y
        return times_transpose(diff, diff)  # inner product of diff with itself = sum of squared differences
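
    For intuition, the same arithmetic in plain NumPy (a sketch of the math, not CNTK code): the inner product of the difference vector with itself is exactly the sum of squared differences.

    ```python
    import numpy as np

    def square_error(x, y):
        # Mirrors times_transpose(diff, diff) above:
        # inner product of the difference with itself = sum of squared differences.
        diff = x - y
        return np.dot(diff, diff)

    x = np.array([1.0, 2.0, 3.0])
    y = np.array([1.0, 0.0, 5.0])
    print(square_error(x, y))  # 0^2 + 2^2 + (-2)^2 = 8.0
    ```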
    

    and the cross_entropy_with_softmax() loss can be written in Python like this:

    from cntk.ops import reduce_log_sum_exp, times_transpose

    def my_cross_entropy_with_softmax(output, labels):
        logZ = reduce_log_sum_exp(output)  # log of softmax denominator (named reduce_log_sum in early releases)
        return logZ - times_transpose(labels, output)  # -log P(correct class): a loss to minimize
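
    As a sanity check on the formula (plain NumPy, not CNTK): for a one-hot label, cross-entropy with softmax equals the log of the softmax denominator minus the logit of the correct class.

    ```python
    import numpy as np

    logits = np.array([2.0, 1.0, 0.1])
    labels = np.array([1.0, 0.0, 0.0])  # one-hot: correct class is index 0

    # Direct definition: -log(softmax(logits)[correct class])
    softmax = np.exp(logits) / np.sum(np.exp(logits))
    direct = -np.log(softmax[0])

    # Log-sum-exp form: logZ - labels . logits
    logZ = np.log(np.sum(np.exp(logits)))
    via_logZ = logZ - np.dot(labels, logits)

    print(np.isclose(direct, via_logZ))  # True
    ```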
    

    Lastly, multi-task learning can be trivially realized by using a loss function that is a weighted sum over multiple losses.
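
    To connect this back to the specific question: the Sørensen–Dice coefficient for two (soft) masks x and y is 2·Σ(x·y) / (Σx + Σy), and the corresponding loss is 1 minus that scalar, which by the rule above can serve directly as a CNTK loss. Below is a plain-NumPy sketch of the arithmetic; in CNTK the same expression could presumably be built from ops such as element_times and reduce_sum plus the overloaded arithmetic operators (an assumption, not code from the original answer — the eps smoothing term is also an added convention, not part of the question).

    ```python
    import numpy as np

    def dice_coefficient(x, y, eps=1e-7):
        # 2 * |intersection| / (|x| + |y|); eps (assumed smoothing term) guards empty masks.
        # In CNTK this could presumably use element_times + reduce_sum instead.
        intersection = np.sum(x * y)
        return 2.0 * intersection / (np.sum(x) + np.sum(y) + eps)

    def dice_loss(x, y):
        # A scalar expression, so per the answer above it can be used as a loss.
        return 1.0 - dice_coefficient(x, y)

    pred   = np.array([1.0, 1.0, 0.0, 0.0])
    target = np.array([1.0, 0.0, 0.0, 0.0])
    print(round(dice_coefficient(pred, target), 3))  # 2*1/(2+1) ≈ 0.667
    ```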