python · keras · hyperparameters

Keras hyperparameter tuning with hyperas using manual metric


I'm using the hyperas documentation example to tune the network's hyperparameters, but based on F1 score instead of accuracy.

I'm using the following implementation of the F1 score:

from keras import backend as K

def f1(y_true, y_pred):
    def recall(y_true, y_pred):
        """Recall metric.
        Only computes a batch-wise average of recall.
        Computes the recall, a metric for multi-label classification of
        how many relevant items are selected.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        possible_positives = K.sum(K.round(K.clip(y_true, 0, 1)))
        recall = true_positives / (possible_positives + K.epsilon())
        return recall

    def precision(y_true, y_pred):
        """Precision metric.
        Only computes a batch-wise average of precision.
        Computes the precision, a metric for multi-label classification of
        how many selected items are relevant.
        """
        true_positives = K.sum(K.round(K.clip(y_true * y_pred, 0, 1)))
        predicted_positives = K.sum(K.round(K.clip(y_pred, 0, 1)))
        precision = true_positives / (predicted_positives + K.epsilon())
        return precision
    precision = precision(y_true, y_pred)
    recall = recall(y_true, y_pred)
    return 2 * ((precision * recall) / (precision + recall + K.epsilon()))
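For reference, here is a plain-Python sketch (my own illustration, not part of the Keras code above) of what this batch-wise metric computes, assuming the predictions have already been rounded to 0/1 as `K.round(K.clip(...))` does:

```python
# Plain-Python sketch of the batch-wise F1 above; `eps` stands in for
# K.epsilon(), and inputs are 0/1 lists rather than Keras tensors.
def f1_batch(y_true, y_pred, eps=1e-7):
    tp = sum(t * p for t, p in zip(y_true, y_pred))  # true positives
    recall = tp / (sum(y_true) + eps)                # TP / possible positives
    precision = tp / (sum(y_pred) + eps)             # TP / predicted positives
    return 2 * (precision * recall) / (precision + recall + eps)

# e.g. y_true = [1, 1, 0, 0], y_pred = [1, 0, 1, 0]
# gives precision = recall = 0.5, so F1 = 0.5
```

Note that, as the docstrings say, this is only a batch-wise approximation: Keras averages the metric over batches, which is not in general equal to the F1 score computed over the whole epoch.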

and updating the metrics argument of the compile() call from:

model.compile(loss='categorical_crossentropy', metrics=['accuracy'],
                  optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})

to

model.compile(loss='categorical_crossentropy', metrics=[f1],
                  optimizer={{choice(['rmsprop', 'adam', 'sgd'])}})

The metric works perfectly without hyperas, but when I try to use it in the tuning process, I get the following error:

Traceback (most recent call last):
  File "D:/path/test.py", line 96, in <module>
    trials=Trials())
  File "C:\Python35\lib\site-packages\hyperas\optim.py", line 67, in minimize
    verbose=verbose)
  File "C:\Python35\lib\site-packages\hyperas\optim.py", line 133, in base_minimizer
    return_argmin=True),
  File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 367, in fmin
    return_argmin=return_argmin,
  File "C:\Python35\lib\site-packages\hyperopt\base.py", line 635, in fmin
    return_argmin=return_argmin)
  File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 385, in fmin
    rval.exhaust()
  File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 244, in exhaust
    self.run(self.max_evals - n_done, block_until_done=self.asynchronous)
  File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 218, in run
    self.serial_evaluate()
  File "C:\Python35\lib\site-packages\hyperopt\fmin.py", line 137, in serial_evaluate
    result = self.domain.evaluate(spec, ctrl)
  File "C:\Python35\lib\site-packages\hyperopt\base.py", line 840, in evaluate
    rval = self.fn(pyll_rval)
  File "D:\path\temp_model.py", line 86, in keras_fmin_fnct
NameError: name 'f1' is not defined

Solution

  • If you are following the code example you linked to, you are not making hyperas aware of the custom f1 function. The package author provides an example of how to do that as well.

    In short, you need to pass an additional functions argument to your optim.minimize() call, something like:

        best_run, best_model = optim.minimize(model=model,
                                              data=data,
                                              functions=[f1],
                                              algo=tpe.suggest,
                                              max_evals=5,
                                              trials=Trials())
    

    I actually just implemented it today, so I'm confident you can get it working, too :)