Tags: python, machine-learning, lightgbm

Why is LightGBM outputting this: Finished loading model, total used X iterations?


I am new to the LightGBM model architecture.

Finished loading model, total used 15000 iterations

This message is printed every time I train the model, and I would like to understand why it is loading a model at all. What is the booster model? How is it trained? And why is it loading an existing model instead of training a new one?

Here's my code:

import lightgbm as lgb

lgb_params = {...}
lgbtrain = lgb.Dataset(data=self.train_x, label=self.train_y)

self.model = lgb.train(
    lgb_params, lgbtrain,
    evals_result=self.evals_result,
    valid_sets=lgbtrain,
    verbose_eval=False,
    callbacks=self.callbacks
)

I found where this message comes from. Later in the code, after training, there is a call to copy.deepcopy(self.model), and that somehow causes the LightGBM library to print the message in question.


Solution

  • I've also seen this printout, and it happens during deepcopy. To reproduce it:

    import copy
    import numpy as np
    import lightgbm as lgb
    
    X = np.random.rand(100,3)
    y = np.random.rand(100)
    train = lgb.Dataset(X, y)
    model = lgb.train({"verbose": -1}, train, num_boost_round=1)
    
    # the printout is here:
    model2 = copy.deepcopy(model)
    

    A quick fix to suppress the printout during the copy:

    import os
    import contextlib
    
    with open(os.devnull, "w") as f, contextlib.redirect_stdout(f):
        model2 = copy.deepcopy(model)
    

    (I got this printout because I call deepcopy directly on the LightGBM booster object.)
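  • As for why copying triggers a model *load* at all: the Booster appears to implement copying by serializing the model to its text representation and reconstructing a new booster from that string, and loading the string is what prints "Finished loading model". This is a sketch, not a statement about LightGBM internals, but you can reproduce the same printout by doing the string round-trip explicitly with `model_to_string()` and `Booster(model_str=...)`:

    ```python
    import numpy as np
    import lightgbm as lgb

    X = np.random.rand(100, 3)
    y = np.random.rand(100)
    model = lgb.train({"verbose": -1}, lgb.Dataset(X, y), num_boost_round=1)

    # Serialize the trained model to its text form, then rebuild a
    # booster from that string -- this prints the same
    # "Finished loading model" message that deepcopy does.
    model_str = model.model_to_string()
    model2 = lgb.Booster(model_str=model_str)

    # The rebuilt booster behaves like the original.
    print(np.allclose(model.predict(X), model2.predict(X)))
    ```

    If that is indeed what deepcopy does under the hood, the message is harmless: it just signals that a copy of the model was reconstructed from its serialized form.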