python-3.x, h2o, automl

Is it possible to get a feature importance plot from an h2o.automl model?


I have a binary classification problem, and I am using "h2o.automl" to obtain a model.

Is it possible to obtain a plot of the importances of my dataset features from the "h2o.automl" model?

A pointer to some python 3 code would be much appreciated.

Thanks. Charles


Solution

  • It depends on which model you are using. If you use the top model on the AutoML Leaderboard, that will probably be a Stacked Ensemble, and we do not yet have a function to extract feature importance for that type of model (though there is a ticket open to add this).

    If you want to use any other type of model (e.g. GBM), then you can use the regular way of getting variable importance from an H2O model. Here's a demo using the example code from the H2O AutoML User Guide.

    import h2o
    from h2o.automl import H2OAutoML
    
    h2o.init()
    
    # Import a sample binary outcome training set into H2O
    train = h2o.import_file("https://s3.amazonaws.com/erin-data/higgs/higgs_train_10k.csv")
    
    # Identify predictors and response
    x = train.columns
    y = "response"
    x.remove(y)
    
    # For binary classification, response should be a factor
    train[y] = train[y].asfactor()
    
    # Run AutoML for 10 models
    aml = H2OAutoML(max_models=10, seed=1)
    aml.train(x=x, y=y, training_frame=train)
    
    # View the AutoML Leaderboard
    lb = aml.leaderboard
    lb
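
    As a small aside (my addition, not part of the original answer), the top model is also available directly as aml.leader, so you can quickly check what kind of model won before digging into the leaderboard:

    # The leader is the top model on the leaderboard; with default settings
    # it is usually a Stacked Ensemble, which is why we look further down
    # the leaderboard for a GBM below.
    print(aml.leader.model_id)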
    

    The top two models are Stacked Ensembles, but the third is a GBM, so we can extract variable importance from that model.

    In [6]: lb[:5,"model_id"]
    
    Out[6]:
    model_id
    -----------------------------------------------------
    StackedEnsemble_AllModels_0_AutoML_20180801_120024
    StackedEnsemble_BestOfFamily_0_AutoML_20180801_120024
    GBM_grid_0_AutoML_20180801_120024_model_4
    GBM_grid_0_AutoML_20180801_120024_model_0
    GBM_grid_0_AutoML_20180801_120024_model_1
    
    [5 rows x 1 column]
    

    Here's how to get the variable importance. First, grab the GBM model object:

    # Get the third model from the leaderboard (the GBM)
    m = h2o.get_model(lb[2,"model_id"])
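
    If you would rather not hard-code the row index, here is a rough sketch (my addition, not from the original answer) that pulls the model IDs into Python and picks the first one that is not a Stacked Ensemble; it assumes pandas is installed, since it goes through as_data_frame():

    # Pull the leaderboard into pandas and pick the first model whose
    # model_id does not start with "StackedEnsemble"
    model_ids = lb.as_data_frame()["model_id"].tolist()
    first_non_ensemble = next(mid for mid in model_ids
                              if not mid.startswith("StackedEnsemble"))
    m = h2o.get_model(first_non_ensemble)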
    

    Then you can get the variable importance back as a Pandas DataFrame (if you have pandas installed) as follows:

    In [13]: m.varimp(use_pandas=True)
    Out[13]:
       variable  relative_importance  scaled_importance  percentage
    0       x26           997.396362           1.000000    0.224285
    1       x28           437.546936           0.438689    0.098391
    2       x27           338.475555           0.339359    0.076113
    3        x6           306.173553           0.306973    0.068849
    4       x25           295.848785           0.296621    0.066528
    5       x23           284.468292           0.285211    0.063968
    6        x1           191.988358           0.192490    0.043172
    7        x4           184.072052           0.184553    0.041392
    8       x10           137.810501           0.138170    0.030989
    9       x14           100.928482           0.101192    0.022696
    10      x12            90.265976           0.090502    0.020298
    11      x22            89.900856           0.090136    0.020216
    12      x20            87.367523           0.087596    0.019646
    13      x19            83.130775           0.083348    0.018694
    14       x5            82.661133           0.082877    0.018588
    15      x16            81.957863           0.082172    0.018430
    16      x18            80.794426           0.081005    0.018168
    17       x7            80.664566           0.080875    0.018139
    18      x11            75.841171           0.076039    0.017054
    19       x2            75.037476           0.075233    0.016874
    20       x8            72.234459           0.072423    0.016243
    21      x15            70.233994           0.070417    0.015794
    22       x3            60.015785           0.060172    0.013496
    23       x9            40.281757           0.040387    0.009058
    24      x13            35.475540           0.035568    0.007977
    25      x17            25.367661           0.025434    0.005704
    26      x24            22.506416           0.022565    0.005061
    27      x21            18.564632           0.018613    0.004175
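
    If you don't have pandas installed, my understanding is that the same table comes back as a plain Python list of tuples (this variant is not shown in the original answer):

    # With the default use_pandas=False, varimp() returns a list of
    # (variable, relative_importance, scaled_importance, percentage) tuples
    raw_varimp = m.varimp()
    print(raw_varimp[:3])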
    

    You can also plot the variable importance using m.varimp_plot() if you have matplotlib installed.
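
    If you want more control over the plot than varimp_plot() gives you, here is a minimal matplotlib sketch (my addition, not from the original answer) built on the pandas DataFrame from above:

    import matplotlib.pyplot as plt

    # Take the ten most important variables from the varimp DataFrame
    varimp_df = m.varimp(use_pandas=True)
    top = varimp_df.head(10)

    # Horizontal bar chart with the most important variable at the top
    plt.barh(top["variable"][::-1], top["scaled_importance"][::-1])
    plt.xlabel("Scaled importance")
    plt.title("GBM variable importance (top 10)")
    plt.tight_layout()
    plt.show()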