I am trying to tune hyperparameters using Hyperas, but I can't interpret a few details about it.
Q1) What does the max_evals parameter in optim.minimize do?
Q2) Does it go through every possible combination of parameters on each evaluation and give me the best loss based on the best combination?
Q3) What if I set max_evals = 5?
Q4) What do best_run and best_model return after completing all max_evals?
Q5) In the model function below I returned the loss as -test_acc. What does that have to do with tuning the parameters, and why do we use a negative sign there?
def model(x_train, y_train, x_test, y_test):
    dense_units1 = {{choice([64, 126, 256, 512])}}
    activations = {{choice(['relu', 'sigmoid'])}}
    epochs = 100
    verbose = 0

    model = Sequential([
        # layer 1
        Dense(dense_units1, activations, input_shape=(784,)),
        ....
        ....
        ....
    ])

    # compiling model
    model.compile(optimizers, loss='categorical_crossentropy', metrics=['accuracy'])

    # fitting the model
    result = model.fit(x_train, y_train, validation_split=0.2, batch_size=batch_size,
                       epochs=epochs, verbose=verbose, callbacks=[ES, MC])

    test_loss, test_acc = model.evaluate(x_test, y_test, batch_size=512)
    return {'loss': -test_acc, 'status': STATUS_OK, 'model': model}

best_run, best_model = optim.minimize(model=model, data=dataset, algo=tpe.suggest,
                                      max_evals=5,
                                      trials=Trials(), notebook_name='MNIST',
                                      verbose=True)
The max_evals parameter is simply the maximum number of optimization runs (e.g. if max_evals = 5, Hyperas will pick a different combination of hyperparameters five times and train each combination for the number of epochs you chose).
No, it does not try every possible combination; it evaluates exactly one combination of hyperparameters per run. The best combination is selected only after all the evaluations you set in max_evals have finished.
Answered in Q1: max_evals = 5 just means five such runs.
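To see the relationship between max_evals and the number of runs concretely, here is a minimal sketch using plain hyperopt (which Hyperas wraps); the search space and the dummy objective below are made up just for illustration:

from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

# toy search space, analogous to the {{choice([...])}} templates in Hyperas
space = {
    'dense_units1': hp.choice('dense_units1', [64, 126, 256, 512]),
    'activation': hp.choice('activation', ['relu', 'sigmoid']),
}

def objective(params):
    # in a real run you would build, train and evaluate a model here;
    # this dummy "accuracy" just stands in for test_acc
    fake_acc = 0.9 if params['activation'] == 'relu' else 0.8
    # hyperopt always minimizes 'loss', so we return the negative accuracy
    return {'loss': -fake_acc, 'status': STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=5,   # exactly 5 runs, one sampled combination each
            trials=trials)

print(len(trials.trials))  # -> 5
print(best)                # indices of the best-performing choices

The objective is called once per evaluation, so with max_evals = 5 it runs exactly five times, each time with one sampled combination of hyperparameters.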
In this case best_run and best_model both refer to the same (best) run: best_run is the dictionary of hyperparameter choices that performed best, and best_model is the Keras model trained with them. You should add this to your code:
print('Best performing model chosen hyper-parameters:')
print(best_run)
This will print the best hyperparameters found across all the runs it made.
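As for Q5: hyperopt (and therefore Hyperas) always minimizes the value you return under 'loss', so returning -test_acc turns minimizing the loss into maximizing the test accuracy. And since best_model is the already-trained Keras model from the winning run, you can evaluate it directly; the short sketch below assumes your dataset function returns (x_train, y_train, x_test, y_test), as Hyperas data functions typically do:

# follow-up sketch, assuming `dataset` is the same data function passed to optim.minimize
x_train, y_train, x_test, y_test = dataset()

print('Evaluation of best performing model on the test set:')
print(best_model.evaluate(x_test, y_test))  # -> [test_loss, test_acc]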