I created a Cartesian grid of GBMs using the h2o package in R and saved the cross-validation metrics for each model in a data frame. That is, for each model I stored the results given in model@model$cross_validation_metrics_summary.
What threshold is used to calculate the F1 and F2 scores, precision, recall, and specificity in model@model$cross_validation_metrics_summary? Is there a default value?
You can get the threshold used for a specific metric with h2o.find_threshold_by_max_metric (H2O reports these binomial metrics at the threshold that maximizes each metric, not at a fixed default such as 0.5). API documentation is here.
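A minimal sketch of how this could look in R, assuming `model` is an already-trained binomial GBM built with cross-validation (the variable names `model` and `perf` are placeholders, not from the original post); running it requires a live H2O cluster:

```r
library(h2o)
h2o.init()

# Assume `model` was trained with cross-validation, e.g.:
# model <- h2o.gbm(x = predictors, y = response,
#                  training_frame = train, nfolds = 5)

# Metrics object for the cross-validated predictions
perf <- h2o.performance(model, xval = TRUE)

# Threshold that maximizes each metric; H2O uses these
# per-metric optimal thresholds when reporting the "max" metrics
h2o.find_threshold_by_max_metric(perf, "f1")
h2o.find_threshold_by_max_metric(perf, "f2")
h2o.find_threshold_by_max_metric(perf, "precision")
h2o.find_threshold_by_max_metric(perf, "recall")
h2o.find_threshold_by_max_metric(perf, "specificity")
```

Note that each metric can have a different optimal threshold, which is why there is no single default threshold behind the values in the cross-validation metrics summary.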