Tags: python · gpu · xgboost

How do I free all memory on GPU in XGBoost?


Here is my code:

clf = xgb.XGBClassifier(
  tree_method = 'gpu_hist',
  gpu_id = 0,
  n_gpus = 4,
  random_state = 55,
  n_jobs = -1
)
clf.set_params(**params)
clf.fit(X_train, y_train, **fit_params)

I've read the answers on this question and this GitHub issue, but neither worked.

I tried to delete the booster in this way:

clf._Booster.__del__()
gc.collect()

It deletes the booster but doesn't completely free up GPU memory.

I guess it's the DMatrix that is still there, but I'm not sure.

How can I free the whole memory?


Solution

  • Well, I don't think there is a way to access the loaded DMatrix, because the fit function doesn't return it. You can check the source code on this GitHub link:

    So I think the best way is to wrap it in a Process and run it that way, like this:

    from multiprocessing import Process
    
    def fitting(args):
        clf = xgb.XGBClassifier(tree_method='gpu_hist', gpu_id=0, n_gpus=4,
                                random_state=55, n_jobs=-1)
        clf.set_params(**params)
        clf.fit(X_train, y_train, **fit_params)
    
        #save the model here on the disk
    
    fitting_process = Process(target=fitting, args=(args,))
    fitting_process.start()
    fitting_process.join()
    
    # load the model from the disk here