python-3.x · scikit-learn · classification · loss-function

Sklearn stop on loss plateau during manual training


Working with sklearn, the fit function of MLPClassifier is a convenient one-size-fits-all solution: you call it once and it trains until it hits the maximum number of iterations or the training loss plateaus, all without any interaction. However, I had to change my code to accommodate some other features, and the standard fit function isn't configurable enough for what I want to do. I restructured my code to use partial_fit instead, manually running one iteration at a time, but I can't figure out how to make my code recognize when the loss plateaus, the way fit does. I can't seem to find any property or method of MLPClassifier that gives me access to the loss value calculated by partial_fit, so that I can judge whether the loss has plateaued. It seems the only way to track the loss across iterations would be to calculate it myself, even though partial_fit already computes it and even prints it to the console in verbose mode.

Edit: Running partial_fit manually does still cause the training algorithm to recognize when the training loss stops improving; once the loss plateaus, it prints the message Training loss did not improve more than tol=0.000100 for 10 consecutive epochs. Stopping. after each iteration. However, because I'm controlling the iterations manually, it doesn't actually stop, and I have no way of detecting in my code whether this message has been printed so that I can stop it myself.


Solution

  • I would recommend manually logging the loss in a list. After each call to partial_fit, the current training loss is available on the estimator as clf.loss_:

    loss_list = []
    clf = MLPClassifier()
    # ... call clf.partial_fit(X, y, classes=classes) each epoch ...
    print(clf.loss_)            # loss of the most recent epoch
    loss_list.append(clf.loss_)
    

    I can provide you with a stopping criterion if this code is helpful.
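    As a sketch of such a stopping criterion: the loop below mirrors the logic behind fit's own message (stop after a number of consecutive epochs where the loss fails to improve by more than tol). The tol=1e-4 and patience=10 values match sklearn's defaults; the dataset and epoch cap are placeholders for illustration:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.neural_network import MLPClassifier

    # Placeholder data; substitute your own X, y
    X, y = make_classification(n_samples=200, random_state=0)
    classes = np.unique(y)

    clf = MLPClassifier(random_state=0)
    tol = 1e-4          # minimum improvement to count as progress
    patience = 10       # epochs without improvement before stopping
    best_loss = np.inf
    no_improvement = 0
    loss_list = []

    for epoch in range(500):
        clf.partial_fit(X, y, classes=classes)
        loss = clf.loss_              # loss of this epoch
        loss_list.append(loss)
        if loss < best_loss - tol:    # meaningful improvement: reset counter
            best_loss = loss
            no_improvement = 0
        else:
            no_improvement += 1
        if no_improvement >= patience:
            print(f"Loss plateaued at epoch {epoch}; stopping.")
            break
    ```

    Note that classes must be passed on the first partial_fit call so the model knows all labels up front, even if a later batch doesn't contain every class.
    
    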