
How to properly use tf.metrics.mean_iou in Tensorflow to show confusion matrix on Tensorboard?


I found that the evaluation script in the official TensorFlow implementation of DeepLabV3+ (eval.py) uses tf.metrics.mean_iou to update the mean IoU, and adds it to TensorBoard as a record.

tf.metrics.mean_iou actually returns 2 tensors: one is the calculated mean IoU, and the other is an update_op which, according to the official doc, increments an internal confusion matrix. It seems that every time you want the calculated mean_iou, you have to run that update_op first.
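
For reference, the pattern I understand from the docs looks roughly like this (a minimal TF 1.x sketch; labels, predictions, num_classes and num_batches stand in for real tensors and values):

    import tensorflow as tf

    miou, update_op = tf.metrics.mean_iou(labels, predictions, num_classes)

    with tf.Session() as sess:
        # the metric's running confusion matrix lives in local variables
        sess.run(tf.local_variables_initializer())
        for _ in range(num_batches):
            sess.run(update_op)    # accumulates the confusion matrix
        print(sess.run(miou))      # mean IoU computed from the accumulated matrix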

I tried adding this update_op to the summary as a tensor, but it does not work. My question is: how do I add this confusion matrix to TensorBoard?

I have seen some other threads on how to calculate a confusion matrix and add it to TensorBoard with extra operations. I would just like to know whether this can be done without those extra operations.

Any help would be appreciated.


Solution

  • I will post my answer here since someone upvoted it.

Let's say you defined the mean_iou op in the following manner:

        miou, update_op = tf.metrics.mean_iou(
            predictions, labels, dataset.num_of_classes, weights=weights)
        tf.summary.scalar(predictions_tag, miou)
    

If you look at your graph in TensorBoard, you will find a node named 'mean_iou'; after expanding it, you will find an op called 'total_confusion_matrix'. This is what you need to calculate recall and precision for each class.

[Screenshot: TensorBoard graph with the mean_iou node expanded, showing total_confusion_matrix]
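
    Before wiring this into the graph, it may help to see how precision and recall fall out of such a matrix. Here is a tiny NumPy sketch with made-up numbers, oriented like the TF snippet below (rows = predictions, columns = ground truth, because predictions is passed as the first argument):

        import numpy as np

        # hypothetical 2-class confusion matrix:
        # rows = predictions, columns = ground-truth labels
        cm = np.array([[8, 1],
                       [2, 9]])

        precision = np.diag(cm) / cm.sum(axis=1)  # per predicted class: [8/9, 9/11]
        recall = np.diag(cm) / cm.sum(axis=0)     # per true class: [0.8, 0.9]
        iou = np.diag(cm) / (cm.sum(axis=1) + cm.sum(axis=0) - np.diag(cm))
        print(precision, recall, iou.mean())      # iou.mean() is the mean IoU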

Once you have the node name, you can add the matrix to your TensorBoard via tf.summary.text, or print it in your terminal with the tf.print function. An example is posted below:

        import sys

        import tensorflow as tf

        # `predictions`, `labels`, `dataset`, `weights`, `predictions_tag` and
        # `FLAGS` come from deeplab's eval.py, where this snippet lives
        miou, update_op = tf.metrics.mean_iou(
            predictions, labels, dataset.num_of_classes, weights=weights)
        tf.summary.scalar(predictions_tag, miou)

        # Fetch the metric's internal confusion matrix by name; the exact
        # tensor name can vary between graphs
        confusion_matrix = tf.get_default_graph().get_tensor_by_name(
            'mean_iou/total_confusion_matrix:0')
    
        # With the (predictions, labels) argument order above, rows of the
        # confusion matrix are predictions and columns are ground-truth labels,
        # so precision normalises each row and recall normalises each column
        precision = confusion_matrix / tf.reshape(
            tf.reduce_sum(confusion_matrix, 1), [-1, 1])
        recall = confusion_matrix / tf.reshape(
            tf.reduce_sum(confusion_matrix, 0), [1, -1])
    
        # Print precision, recall and miou in the terminal
        precision_op = tf.print("Precision:\n", precision,
                                output_stream=sys.stdout)
        recall_op = tf.print("Recall:\n", recall,
                             output_stream=sys.stdout)
        miou_op = tf.print("Miou:\n", miou,
                           output_stream=sys.stdout)
    
        # Add the precision and recall matrices to TensorBoard as text summaries
        tf.summary.text('recall_matrix', tf.dtypes.as_string(recall, precision=4))
        tf.summary.text('precision_matrix', tf.dtypes.as_string(precision, precision=4))
    
        # Hooks: write merged summaries at the end of evaluation,
        # and run the print ops as final ops
        summary_op = tf.summary.merge_all()
        summary_hook = tf.contrib.training.SummaryAtEndHook(
            log_dir=FLAGS.eval_logdir, summary_op=summary_op)
        precision_op_hook = tf.train.FinalOpsHook(precision_op)
        recall_op_hook = tf.train.FinalOpsHook(recall_op)
        miou_op_hook = tf.train.FinalOpsHook(miou_op)
        hooks = [summary_hook, precision_op_hook, recall_op_hook, miou_op_hook]
    
        num_eval_iters = None
        if FLAGS.max_number_of_evaluations > 0:
          num_eval_iters = FLAGS.max_number_of_evaluations
    
        if FLAGS.quantize_delay_step >= 0:
          tf.contrib.quantize.create_eval_graph()
    
        tf.contrib.training.evaluate_repeatedly(
            master=FLAGS.master,
            checkpoint_dir=FLAGS.checkpoint_dir,
            eval_ops=[update_op],
            max_number_of_evaluations=num_eval_iters,
            hooks=hooks,
            eval_interval_secs=FLAGS.eval_interval_secs)
    
    

    Then you will have your precision and recall matrices summarised in your TensorBoard:

    [Screenshot: the recall and precision matrices rendered as text summaries in TensorBoard]
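
    Note that tf.summary.text output appears under the Text tab in TensorBoard, while the miou scalar stays under Scalars; with SummaryAtEndHook, both are written once, at the end of each evaluation run.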