
How to get the metrics of a LightGBM booster?


Here is my code:

import lightgbm as lgb

params = {
    "objective": "binary",
    "seed": 0
}

# Keep the raw data around so the Datasets can be re-used after training.
dtrain = lgb.Dataset(data=X_train, label=y_train, feature_name=X_train.columns.tolist(),
                     categorical_feature=cat_features, free_raw_data=False)
dvalid = lgb.Dataset(data=X_test, label=y_test, feature_name=X_test.columns.tolist(),
                     categorical_feature=cat_features, free_raw_data=False)

lgb_class = lgb.train(
    params=params,
    train_set=dtrain,
    num_boost_round=10000,
    valid_sets=[dtrain, dvalid],
    early_stopping_rounds=200
)

During training, LightGBM reports the binary_logloss for both the training data and the validation data. How can I access these metrics from the booster after training? I am particularly interested in the values at the best iteration.


Solution

  • It is straightforward: with early stopping enabled, the booster's best_score attribute holds the recorded metric values at the best iteration, keyed first by dataset name and then by metric name. For the training data:

    lgb_class.best_score['training']['binary_logloss']
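
  • For completeness, here is a minimal sketch of reading everything back from the trained booster. It assumes the lgb_class booster from the training call above and LightGBM's default dataset names, i.e. 'training' for the training set and 'valid_1' for the second entry of valid_sets, since no valid_names were passed:

    # Metric values recorded at the best iteration (early stopping was enabled).
    train_logloss = lgb_class.best_score["training"]["binary_logloss"]
    valid_logloss = lgb_class.best_score["valid_1"]["binary_logloss"]

    # The iteration those values correspond to.
    best_iter = lgb_class.best_iteration

    print(f"best iteration: {best_iter}")
    print(f"train binary_logloss: {train_logloss:.5f}")
    print(f"valid binary_logloss: {valid_logloss:.5f}")

  • If you want the full per-iteration history rather than only the values at the best iteration, you can pass lgb.record_evaluation(some_dict) in the callbacks argument of lgb.train; LightGBM will fill that dictionary with the metric values for every evaluation set as training runs.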