machine-learning · neural-network · grid-search

Using continuous values for hyperparameters in grid search (ANN)


I'm trying to tune the hyperparameters of a neural network (a regression problem), and I have a few questions:

  1. In what order should I try the automatic optimisation methods (grid search, random search, Bayesian optimisation, genetic algorithms, ...)?
  2. I started with grid search to get a feel for the learning process. I know grid search gives the optimal result over the grid, but it is time-consuming; time is not a problem for me, so I want to search the best possible space. However, I only know how to choose discrete values for a hyperparameter, and I don't know how to give a hyperparameter a continuous range to test. For example, I want to test epoch values between 500 and 10000 with a step of 200, and learning rates between 0.001 and 0.9. How can I achieve that with grid search, or with any other optimisation method, for an ANN?
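To make the ranges above concrete: the epoch axis (500 to 10000, step 200) is a discrete grid that can simply be enumerated, while a grid search cannot cover a truly continuous learning-rate interval, so that axis has to be discretized (a log-spaced grid is a common choice when the range spans several orders of magnitude). A minimal NumPy sketch of both axes, with an assumed grid of 10 learning-rate points:

```python
import numpy as np

# Discrete axis: epochs from 500 up to 10000 in steps of 200.
# (500 + 200k never lands exactly on 10000; the last point is 9900,
# giving 48 grid points in total.)
epochs = np.arange(500, 10001, 200)

# Continuous axis discretized for grid search: 10 log-spaced
# learning rates covering 0.001 to 0.9. The number of points (10)
# is an arbitrary illustrative choice.
learning_rates = np.logspace(np.log10(0.001), np.log10(0.9), num=10)

print(len(epochs), epochs[0], epochs[-1])        # 48 500 9900
print(learning_rates[0], learning_rates[-1])     # endpoints ~0.001 and ~0.9
```

The full grid would then be the Cartesian product of the two axes (48 × 10 = 480 trials here), which is why continuous parameters are usually handled by random or Bayesian search instead of ever-finer grids.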

Solution

  • You should have a look at Tune, which is built on Ray. It offers several powerful algorithms for tuning both continuous and discrete parameters, from grid search to more advanced strategies such as population-based training. Plus, it is rather easy to use.
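The underlying idea is a mixed search space: a discrete list for one hyperparameter and a continuous distribution for the other, with trials drawn by random sampling rather than grid enumeration. As a framework-neutral sketch of that idea (using scikit-learn's `ParameterSampler` and SciPy's `loguniform` distribution rather than Tune's own primitives; the trial count and seed are arbitrary):

```python
from scipy.stats import loguniform
from sklearn.model_selection import ParameterSampler

# Mixed search space: a discrete grid for epochs, a continuous
# log-uniform distribution for the learning rate (0.001 to 0.9).
space = {
    "epochs": list(range(500, 10001, 200)),
    "learning_rate": loguniform(1e-3, 0.9),
}

# Random search: each trial is an independent draw from the space.
# Every draw gets a learning rate from the *continuous* interval,
# which a fixed grid can never do.
trials = list(ParameterSampler(space, n_iter=20, random_state=0))

for params in trials[:3]:
    print(params)  # e.g. {'learning_rate': ..., 'epochs': ...}
```

Each `params` dict would then be passed to the (not shown) training function, and the trial with the best validation score kept. Tune expresses the same space with its own primitives and adds schedulers and early stopping on top.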