Tags: tensorflow, tensorboard

What's the meaning of "n tensors" in the TensorBoard graph?


I'm reading the TensorFlow tutorial code mnist_deep.py and saved the graph for TensorBoard.

The output of scope fc1 should have shape [-1, 1024], but the edge leaving it is labeled "2 tensors" in the TensorBoard graph.

What does "n tensors" mean in a TensorBoard graph?

  # Fully connected layer 1 -- after 2 round of downsampling, our 28x28 image
  # is down to 7x7x64 feature maps -- maps this to 1024 features.
  with tf.name_scope('fc1'):
    W_fc1 = weight_variable([7 * 7 * 64, 1024])
    b_fc1 = bias_variable([1024])

    h_pool2_flat = tf.reshape(h_pool2, [-1, 7*7*64])
    h_fc1 = tf.nn.relu(tf.matmul(h_pool2_flat, W_fc1) + b_fc1)

  # Dropout - controls the complexity of the model, prevents co-adaptation of
  # features.
  with tf.name_scope('dropout'):
    keep_prob = tf.placeholder(tf.float32)
    h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)
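For reference, the graph was written out for TensorBoard roughly like this (a minimal sketch; the log directory name and writer placement are my own choices, not part of mnist_deep.py):

    import tensorflow as tf

    # ... build the graph as in mnist_deep.py ...

    # Write the default graph so TensorBoard can render it.
    writer = tf.summary.FileWriter('./logs', tf.get_default_graph())
    writer.close()
    # Then run: tensorboard --logdir ./logs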

[TensorBoard screenshot: the edge from the fc1 node to the dropout node is labeled "2 tensors"]


Solution

  • It means that the output tensor of the ReLU is used twice inside the Dropout node. If you expand the dropout node, you should see that input feeding into two different ops (see the sketch below).
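You can confirm this by listing the ops that consume the ReLU output directly in Python. Below is a minimal sketch (TensorFlow 1.x graph mode; the standalone placeholder and variable initializers are assumptions standing in for the full mnist_deep.py graph):

    import tensorflow as tf

    # Rebuild just the fc1 -> dropout portion of the graph.
    x = tf.placeholder(tf.float32, [None, 7 * 7 * 64])
    W_fc1 = tf.Variable(tf.truncated_normal([7 * 7 * 64, 1024], stddev=0.1))
    b_fc1 = tf.Variable(tf.constant(0.1, shape=[1024]))
    h_fc1 = tf.nn.relu(tf.matmul(x, W_fc1) + b_fc1)

    keep_prob = tf.placeholder(tf.float32)
    h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)

    # Each entry is an op that takes h_fc1 as an input; dropout
    # internally uses it both to get its shape and to scale it,
    # which is why TensorBoard labels the edge "2 tensors".
    print([op.name for op in h_fc1.consumers()])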