I ran the following code:
```python
import tensorflow as tf

W = tf.Variable(tf.zeros([1, 3]), dtype=tf.float32, name="W")
B = tf.constant([[1, 2, 3]], dtype=tf.float32, name="B")
act = tf.add(W, B)

init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    sess.run(act)
    writer = tf.summary.FileWriter("./graphs", sess.graph)
    writer.close()
```
And verified it with TensorBoard:
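(For reference, TensorBoard reads the event file that the `FileWriter` wrote, so it can be launched on the same directory with `tensorboard --logdir ./graphs`.)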
What confuses me is the `read` operation and the operation just before it, denoted `(W)`. Constant `B` is directed straight into the `Add` operation, while the `tf.Variable` has all these operation nodes inside. Here are my questions:
1. What is the `(W)` operation? Constant `B` is a regular circle, which denotes a constant, and oval-shaped nodes denote operations. `(W)` doesn't seem like any operation, yet it is drawn with the same oval shape. What is that node's job?
2. The `Add` node explicitly reads the `(W)` node through a `read` operation, whereas the constant node `B` feeds in directly. Why is `read` necessary for variable nodes?
The `W` operation is the `tf.Variable` you declared here: `W = tf.Variable(tf.zeros([1, 3]), dtype=tf.float32, name="W")`. Behind the scenes it creates several operations: `W.initializer` (your init op), `W.value()` (your `read` op, an `Identity` that returns the variable's current value), `W.assign()` (which writes a new value into the variable), and possibly more. You can also see your initial value `zeros`.
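As a rough sketch of how those hidden ops behave (TF 1.x API, reusing `W` from your code; the assigned value here is just a made-up example):

```python
with tf.Session() as sess:
    sess.run(W.initializer)             # runs the internal assign-from-initial-value op
    print(sess.run(W.value()))          # runs the internal read op -> [[0. 0. 0.]]
    sess.run(W.assign([[4., 5., 6.]]))  # writes a new value into the variable's storage
    print(sess.run(W.value()))          # reads again -> [[4. 5. 6.]]
```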
All of this is internal to `tf.Variable`, and you should not need to worry about it. That is why it was all folded (abstracted away) behind the big red border.
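If you want to peek behind that border, a minimal sketch (assuming your graph is the default graph, TF 1.x) is to list the operations the variable added:

```python
for op in tf.get_default_graph().get_operations():
    print(op.name, op.type)
# For the graph above this prints something like:
#   zeros     Const       (the initial value)
#   W         VariableV2  (the variable's storage)
#   W/Assign  Assign      (the initializer)
#   W/read    Identity    (the `read` node feeding Add)
#   B         Const
#   Add       Add
```

Roughly speaking, that `W/read` Identity is what lets consumers like `Add` take a snapshot of the variable's current value instead of being wired to its mutable storage directly, which is why a constant like `B` needs no such node.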