I've used scikit-learn's GridSearchCV to optimize the hyperparameters of my models, but I'm wondering whether a similar tool exists for TensorFlow (for instance, number of epochs, learning rate, sliding window size, etc.).
And if not, how can I implement a snippet that efficiently runs all the different combinations?
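If you want to do it by hand, a minimal sketch looks like the following — `itertools.product` enumerates every combination of the grid. The `train_and_evaluate` function here is a hypothetical placeholder for your own TensorFlow training loop; it just needs to return a validation score.

```python
import itertools

# The grid of hyperparameters to sweep over.
param_grid = {
    "learning_rate": [1e-3, 1e-4],
    "num_epochs": [10, 20],
    "window_size": [5, 10],
}

def train_and_evaluate(params):
    # Placeholder: build and train your TensorFlow model here,
    # then return a validation metric. Dummy score for illustration:
    return sum(params.values())

best_score, best_params = float("-inf"), None
for values in itertools.product(*param_grid.values()):
    params = dict(zip(param_grid.keys(), values))
    score = train_and_evaluate(params)
    if score > best_score:
        best_score, best_params = score, params
```

After the loop, `best_params` holds the combination with the highest score. Note that every combination is trained from scratch, so the cost grows multiplicatively with the grid size — which is exactly the pain point that dedicated tuning libraries address.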
Another viable (and well-documented) option for grid search with TensorFlow is Ray Tune. It's a scalable framework for hyperparameter tuning, built specifically for deep learning and reinforcement learning.
You can try out a fast tutorial here.
It also takes care of TensorBoard logging and efficient search algorithms (e.g., HyperOpt integration and HyperBand) in about 10 lines of Python.
```python
from ray import tune

def train_tf_model(config):
    # `model`, `num_epochs`, and `train_one_epoch` stand in for your own
    # TensorFlow training setup; read hyperparameters from `config`,
    # e.g. config["alpha"].
    for i in range(num_epochs):
        accuracy = train_one_epoch(model)
        tune.report(acc=accuracy)  # report the metric back to Tune

tune.run(
    train_tf_model,
    config={
        "alpha": tune.grid_search([0.2, 0.4, 0.6]),
        "beta": tune.grid_search([1, 2]),
    })
```
(Disclaimer: I contribute actively to this project!)