Tags: python, tensorflow, summary

How to use several summary collections in TensorFlow?


I have two distinct groups of summaries: one is collected once per batch, the other once per epoch. How can I use merge_all_summaries(key='???') to collect the summaries in these two groups separately? Doing it manually is always an option, but there seems to be a better way.

Illustration of how I think it should work:

      # once per batch 
      tf.scalar_summary("loss", graph.loss)
      tf.scalar_summary("batch_acc", batch_accuracy)
      # once per epoch
      gradients = tf.gradients(graph.loss, [W, D])
      tf.histogram_summary("embedding/W", W, collections='per_epoch')
      tf.histogram_summary("embedding/D", D, collections='per_epoch')

      tf.merge_all_summaries()                # -> (MergeSummary...) :)
      tf.merge_all_summaries(key='per_epoch') # -> NONE              :(

Solution

  • Problem solved: the collections parameter of a summary op must be a list, not a plain string. Solution:

      # once per batch 
      tf.scalar_summary("loss", graph.loss)
      tf.scalar_summary("batch_acc", batch_accuracy)
      # once per epoch
      tf.histogram_summary("embedding/W", W, collections=['per_epoch'])
      tf.histogram_summary("embedding/D", D, collections=['per_epoch'])
    
      tf.merge_all_summaries()                # -> (MergeSummary...) :)
      tf.merge_all_summaries(key='per_epoch') # -> (MergeSummary...) :)
    

    Edit: the syntax changed in newer TensorFlow versions; the same approach with the tf.summary API:

      # once per batch
      tf.summary.scalar("loss", graph.loss)
      tf.summary.scalar("batch_acc", batch_accuracy)
      # once per epoch
      tf.summary.histogram("embedding/W", W, collections=['per_epoch'])
      tf.summary.histogram("embedding/D", D, collections=['per_epoch'])
    
      tf.summary.merge_all()                # -> (MergeSummary...) :)
      tf.summary.merge_all(key='per_epoch') # -> (MergeSummary...) :)
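
    For completeness, here is a minimal, self-contained usage sketch with the updated API (TF 1.x), showing how the two merged ops can be written at different frequencies through a tf.summary.FileWriter. The toy variable, loss, and batch/epoch counts below are made up purely for illustration.

      import numpy as np
      import tensorflow as tf

      # toy model: fit a single scalar weight
      x = tf.placeholder(tf.float32, shape=[None])
      W = tf.Variable(1.0)
      loss = tf.reduce_mean(tf.square(W * x - 2.0))
      train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

      # per-batch summaries land in the default collection
      tf.summary.scalar("loss", loss)
      # per-epoch summaries go to a separate collection
      tf.summary.histogram("W", W, collections=['per_epoch'])

      per_batch = tf.summary.merge_all()
      per_epoch = tf.summary.merge_all(key='per_epoch')

      with tf.Session() as sess:
          sess.run(tf.global_variables_initializer())
          writer = tf.summary.FileWriter('/tmp/summaries', sess.graph)
          step = 0
          for epoch in range(3):                    # toy epoch count
              for _ in range(10):                   # toy batches per epoch
                  batch = np.random.rand(32).astype(np.float32)
                  _, s = sess.run([train_op, per_batch], feed_dict={x: batch})
                  writer.add_summary(s, step)       # written every batch
                  step += 1
              s = sess.run(per_epoch)
              writer.add_summary(s, step)           # written once per epoch
          writer.close()

    Pointing TensorBoard at /tmp/summaries then shows the per-batch scalars alongside the per-epoch histogram.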