python, scikit-learn, random-forest, decision-tree, depth

How do you access tree depth in Python's scikit-learn?


I'm using scikit-learn to create a Random Forest, and I want to find the individual depth of each tree. It seems like a simple attribute to have, but according to the documentation (http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html), there is no way of accessing it.

If this isn't possible, is there a way of accessing the tree depth from a Decision Tree model?

Any help would be appreciated. Thank you.


Solution

  • Each instance of RandomForestClassifier has an estimators_ attribute, which is a list of DecisionTreeClassifier instances. The documentation shows that each DecisionTreeClassifier instance has a tree_ attribute, which is an instance of the (undocumented, I believe) Tree class. Some exploration in the interpreter shows that each Tree instance has a max_depth attribute, which appears to be what you're looking for -- again, it's undocumented.

    In any case, if forest is your instance of RandomForestClassifier, then:

    >>> [estimator.tree_.max_depth for estimator in forest.estimators_]
    [9, 10, 9, 11, 9, 9, 11, 7, 13, 10]
    

    should do the trick.

    Each estimator also has a get_depth() method that can be used to retrieve the same value with briefer syntax:

    >>> [estimator.get_depth() for estimator in forest.estimators_]
    [9, 10, 9, 11, 9, 9, 11, 7, 13, 10]
    

    To avoid confusion, note that each estimator (as opposed to each estimator's tree_) also has a max_depth attribute, which returns the value of the parameter you set rather than the depth of the actual fitted tree. How estimator.get_depth(), estimator.tree_.max_depth, and estimator.max_depth relate to each other is clarified in the example below:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    # Cap every tree in the forest at depth 6 via the max_depth parameter.
    clf = RandomForestClassifier(n_estimators=3, random_state=4, max_depth=6)
    iris = load_iris()
    clf.fit(iris['data'], iris['target'])
    # Fitted depth (two equivalent ways) versus the max_depth setting, per tree.
    [(est.get_depth(), est.tree_.max_depth, est.max_depth) for est in clf.estimators_]
    

    Out:

    [(6, 6, 6), (3, 3, 6), (4, 4, 6)]
    

    Setting max_depth to its default value of None would allow the first tree to grow to depth 7, and the output would be:

    [(7, 7, None), (3, 3, None), (4, 4, None)]
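
    For reference, here is a minimal sketch of that unconstrained run -- the same data and random_state as above, just with max_depth left at its default of None:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    # max_depth is not set, so it stays None and each tree grows until its
    # leaves are pure (or another stopping criterion such as min_samples_split applies).
    clf = RandomForestClassifier(n_estimators=3, random_state=4)
    iris = load_iris()
    clf.fit(iris['data'], iris['target'])
    # get_depth() and tree_.max_depth report the fitted depth; est.max_depth echoes None.
    [(est.get_depth(), est.tree_.max_depth, est.max_depth) for est in clf.estimators_]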