Tags: tensorflow, keras, loss

Keras/TensorFlow metrics: is the loss always calculated?


I came across this page. It defines METRICS as below; my questions follow the training output.

METRICS = [
    keras.metrics.TruePositives(name='tp'),
    keras.metrics.FalsePositives(name='fp'),
    keras.metrics.TrueNegatives(name='tn'),
    keras.metrics.FalseNegatives(name='fn'),
    keras.metrics.BinaryAccuracy(name='accuracy'),
    keras.metrics.Precision(name='precision'),
    keras.metrics.Recall(name='recall'),
    keras.metrics.AUC(name='auc'),
]
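
A metrics list like this is normally passed to model.compile() and then reported by model.fit(); the sketch below shows one plausible wiring. The model, data shapes, and fit arguments are illustrative assumptions, not taken from the linked page.

import numpy as np
from tensorflow import keras

# Synthetic stand-ins for the real dataset; shapes and sizes are illustrative.
train_features = np.random.rand(1024, 10).astype('float32')
train_labels = (np.random.rand(1024) < 0.05).astype('float32')
val_features = np.random.rand(256, 10).astype('float32')
val_labels = (np.random.rand(256) < 0.05).astype('float32')

# A tiny binary classifier, just to have something to compile.
model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(16, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid'),
])

# The loss is configured via the `loss` argument; METRICS only adds extra
# columns to the progress bar.
model.compile(optimizer='adam',
              loss=keras.losses.BinaryCrossentropy(),
              metrics=METRICS)

model.fit(train_features, train_labels,
          batch_size=256,
          epochs=3,
          validation_data=(val_features, val_labels))  # enables the val_* columns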

Train on 182276 samples, validate on 45569 samples
    Epoch 1/100
    182276/182276 [==============================] - 2s 12us/sample - loss: 0.0139 - tp: 7.0000 - fp: 124.0000 - tn: 181835.0000 - fn: 310.0000 - accuracy: 0.9976 - precision: 0.0534 - recall: 0.0221 - auc: 0.7262 - val_loss: 0.0074 - val_tp: 4.0000 - val_fp: 0.0000e+00 - val_tn: 45492.0000 - val_fn: 73.0000 - val_accuracy: 0.9984 - val_precision: 1.0000 - val_recall: 0.0519 - val_auc: 0.8742
    Epoch 2/100
    182276/182276 [==============================] - 0s 3us/sample - loss: 0.0076 - tp: 91.0000 - fp: 30.0000 - tn: 181929.0000 - fn: 226.0000 - accuracy: 0.9986 - precision: 0.7521 - recall: 0.2871 - auc: 0.8828 - val_loss: 0.0053 - val_tp: 39.0000 - val_fp: 7.0000 - val_tn: 45485.0000 - val_fn: 38.0000 - val_accuracy: 0.9990 - val_precision: 0.8478 - val_recall: 0.5065 - val_auc: 0.8761
    Epoch 3/100
    182276/182276 [==============================] - 0s 3us/sample - loss: 0.0064 - tp: 146.0000 - fp: 36.0000 - tn: 181923.0000 - fn: 171.0000 - accuracy: 0.9989 - precision: 0.8022 - recall: 0.4606 - auc: 0.8981 - val_loss: 0.0049 - val_tp: 45.0000 - val_fp: 7.0000 - val_tn: 45485.0000 - val_fn: 32.0000 - val_accuracy: 0.9991 - val_precision: 0.8654 - val_recall: 0.5844 - val_auc: 0.8828
  1. Why is the loss displayed after each epoch if it is not part of METRICS? Is reporting the loss a default? Would it also be present for regression or multi-class classification?
  2. Keras displays each metric for both the training and the validation data. Is that because we pass validation data when fitting the model, via validation_data=(val_features, val_labels)? If we don't provide validation data, would that raise an error because the validation metrics can't be printed?

Solution

  • 1 - Yes, reporting the loss is the default; unless you set verbose=0, in which case nothing is printed at all. And yes, the loss is present in every case, including regression and multi-class classification.

  • 2 - Yes, the val_ metrics appear because you pass validation_data. If you don't provide validation_data, you simply won't get any val_ metrics; no error is raised. Both behaviours are shown in the sketch below.
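
As a minimal check of both points (the tiny model and synthetic data below are assumptions for illustration): with no metrics argument and no validation_data, the progress bar still shows loss and the history contains no val_ keys, and verbose=0 only hides the printout while the loss is still computed and recorded.

import numpy as np
from tensorflow import keras

X = np.random.rand(512, 10).astype('float32')
y = (np.random.rand(512) < 0.1).astype('float32')

model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(1, activation='sigmoid'),
])
# No metrics argument at all; the same holds for regression losses such as 'mse'.
model.compile(optimizer='adam', loss='binary_crossentropy')

# Without validation_data the progress bar shows only `loss`,
# and no val_* keys appear in the history. No error is raised.
history = model.fit(X, y, epochs=2, batch_size=64)
print(history.history.keys())      # expected: dict_keys(['loss'])

# verbose=0 suppresses the printout, but the loss is still computed.
history = model.fit(X, y, epochs=2, batch_size=64, verbose=0)
print(history.history['loss'])     # one value per epoch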