lightgbm

How to get the binary_logloss for the best iteration of LightGBM?


I am trying to use Optuna to tune the parameters of a LightGBM binary classification model. Optuna requires defining an objective function that returns the metric it should optimize. I would like to return the default binary_logloss metric, which LightGBM already reports during training.


Here is my code (adjusted from here):

def objective(trial):
    params = {
        'objective': 'binary',
        'learning_rate': trial.suggest_loguniform('learning_rate', 1e-5, 1e-2),
        'num_leaves': trial.suggest_int('num_leaves', 2, 128)
    }
    model = lgb.train(params, dtrain, valid_sets=[dtrain, dvalid], num_boost_round=10000, early_stopping_rounds=200)

    return model.best_score['valid_0']['binary_logloss'] # this does not work

This produces the error "KeyError('binary_logloss')".

So it seems this is not how the binary_logloss of the best iteration of a LightGBM model can be accessed. How can I access it?


Solution

  • The issue is that 'valid_0' is not a key in best_score here (the code snippet in that link seems outdated or wrong). When the training Dataset itself is included in valid_sets, LightGBM names it 'training', and the remaining sets are named 'valid_<index>' by their position in valid_sets (unless you pass explicit names via valid_names). The correct way to access this metric is:

    booster.best_score['training']['binary_logloss']

    For the held-out set in the question (the second entry in valid_sets), the key is 'valid_1', which is the score the Optuna objective should return:

    model.best_score['valid_1']['binary_logloss']