machine-learning, hyperparameters, automl, hyperopt

Hyperopt: Change Values of Trials() Object Manually; Warm Start Hyperopt


I am looking for a way to warm-start Hyperopt. One option would be to manually fill the list Trials.trials with hyperparameters. This is actually possible, yet I wonder whether it really influences the optimization, or whether Trials.trials is only the visible part of the Trials object and Hyperopt internally relies on other data.


Solution

  • The trials.trials list does not contain all of the information: one also has to change trials._dynamic_trials, because the refresh function in base.py overwrites the data in trials.trials with the data from trials._dynamic_trials.

    In general, warm-starting should be possible. I created a fake trials object of the size of my warm-start states by calling fmin on a fresh Trials object with some arbitrary search space and objective function. After that, the trials object can be changed by iterating over the length of trials.trials and setting the values like so:

    from functools import partial
    from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

    # each entry holds one previously evaluated point together with its result
    list_of_coldstart_dicts = [one_possible_and_evaluation, second_possible_and_evaluation, ...]

    # arbitrary search space and objective, only used to create a Trials object of the right size
    fake_space = {
        'test': 2 - hp.loguniform('test_02', 0.001, 0.1)
    }

    def objective(params):
        return {'loss': 0, 'status': STATUS_OK}

    trials = Trials()
    fmin(objective, fake_space,
         algo=partial(tpe.suggest, n_startup_jobs=len(list_of_coldstart_dicts)),
         max_evals=len(list_of_coldstart_dicts),
         trials=trials, verbose=1)

    # overwrite the fake evaluations with the warm-start data
    for i in range(len(trials.trials)):
        trials.trials[i] = list_of_coldstart_dicts[i]
        trials._dynamic_trials[i] = list_of_coldstart_dicts[i]
        trials.results[i] = trials.trials[i]['result']

    Beware: you have to maintain the necessary structure of the dict of dicts inside trials.trials[i]; a rough sketch of that layout follows below.
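
    For illustration, here is a minimal sketch of that dict-of-dicts layout for a single finished trial. The helper name build_trial_doc and the concrete numbers are hypothetical; the keys mirror what fmin itself writes into trials.trials for the fake space above, so compare against one of the fake entries before overwriting anything.

    from datetime import datetime
    from hyperopt import STATUS_OK
    from hyperopt.base import JOB_STATE_DONE  # marks the trial as already evaluated

    def build_trial_doc(tid, loss, vals):
        # Hypothetical helper: assembles one warm-start entry in the layout
        # hyperopt's Trials stores, e.g. vals = {'test_02': [0.05]}
        # (one list of raw sampled values per hyperparameter label).
        return {
            'state': JOB_STATE_DONE,
            'tid': tid,
            'spec': None,
            'result': {'loss': loss, 'status': STATUS_OK},
            'misc': {
                'tid': tid,
                'cmd': ('domain_attachment', 'FMinIter_Domain'),
                'workdir': None,
                'idxs': {name: [tid] for name in vals},  # which trial each value belongs to
                'vals': vals,
            },
            'exp_key': None,
            'owner': None,
            'version': 0,
            'book_time': datetime.utcnow(),
            'refresh_time': datetime.utcnow(),
        }

    # placeholder losses and parameter values, purely for illustration
    list_of_coldstart_dicts = [
        build_trial_doc(0, 0.42, {'test_02': [0.05]}),
        build_trial_doc(1, 0.37, {'test_02': [0.02]}),
    ]

    Since trials.refresh() rebuilds trials.trials from trials._dynamic_trials, patching only trials.trials would be undone on the next refresh, which is another way to see why both lists have to be set in the loop above.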