I am trying to tune hyperparameters with KerasTuner. The neural network has two outputs, and I record error metrics for each output, e.g.:

model = tf.keras.models.Model(inputs=inputs, outputs=[out1, out2])
The tuning process is here:
tuner = keras_tuner.BayesianOptimization(
    hypermodel=wrapped_model,
    objective="mae",
    max_trials=max_trials,
    overwrite=True,
    hyperparameters=hyperparameters)
The code works when the network has a single output, but with two outputs it raises KeyError: 'mae'.
The relevant part of the model setup is below:

optimizer = tf.keras.optimizers.Adam(learning_rate=hp_lr,
                                     beta_1=0.9, beta_2=0.999,
                                     epsilon=1e-07, decay=0)
# compile model
model.compile(loss='mae', optimizer=optimizer, metrics=['mae', 'mse', 'mape'])
During tuning, the single-output version prints the "best mae so far" after each trial, but here it prints None.
"mae"
is not a valid input for objective
. From the documentation here:
The objective name should be consistent with the one you use as the key in the logs passed to the 'on_epoch_end()' method of the callbacks.
The default callback is tf.keras.callbacks.History()
and if you're trying to minimize the loss function using your validation split, the correct objective name should be "val_loss"
.
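As a concrete illustration (with hypothetical layer names "out1" and "out2" and made-up metric values), Keras prefixes each per-output metric with the output layer's name in the logs it passes to callbacks, so a bare "mae" key never appears for a multi-output model:

```python
# Hypothetical logs dict that a two-output Keras model might pass to
# on_epoch_end() after one epoch (values are made up for illustration):
logs = {
    "loss": 1.2,            # combined loss over both outputs
    "out1_loss": 0.7, "out2_loss": 0.5,
    "out1_mae": 0.7, "out2_mae": 0.5,
    "val_loss": 1.4,        # combined loss on the validation split
    "val_out1_mae": 0.8, "val_out2_mae": 0.6,
}

# objective="mae" fails because "mae" is not one of the logged keys:
assert "mae" not in logs

# Valid objective names are the actual keys, e.g. the combined
# validation loss, or a single output's validation metric:
assert "val_loss" in logs
assert "val_out1_mae" in logs
```

So passing objective="val_loss" (or a specific per-output key such as "val_out1_mae") to BayesianOptimization should resolve the KeyError.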