I'm trying to make a small neural network in TensorFlow and I'm a bit new to this. I saw this in a tutorial (http://de.slideshare.net/tw_dsconf/tensorflow-tutorial) and everything works fine until I try to optimize the weights (with gradient descent), since I get a None value.
    import numpy as np
    import tensorflow as tf

    with tf.Session() as sess:
        x = tf.placeholder("float", [1, 3], name="x")
        w = tf.Variable(tf.random_uniform([3, 3]), name="w")
        y = tf.matmul(x, w)
        labels = tf.placeholder("float", [1, 3], name="labels")
        relu_out = tf.nn.relu(y)
        cross_entropy = tf.nn.softmax_cross_entropy_with_logits(relu_out, labels, name="loss")
        optimizer = tf.train.GradientDescentOptimizer(0.5)
        train_op = optimizer.minimize(cross_entropy)
        e_labels = np.array([[1.0, 1.0, 0.0]])
        sess.run(tf.initialize_all_variables())
        for step in range(10):
            [out, loss] = sess.run([train_op, cross_entropy],
                                   feed_dict={x: np.array([[1.0, 2.0, 3.0]]), labels: e_labels})
            print("the result is:", out)
            print("The loss of the function is:", loss)
So far I have changed the label values (e_labels) and the input values (x), but I always get a None result. My question is: is that None value normal? I don't think it is, but if someone could tell me, I would be glad to know what I can do and how to solve it.
I assume you mean that the value of out (i.e., the first return value from sess.run([train_op, cross_entropy], ...)) is None.
This is perfectly normal: train_op is a tf.Operation, and when you pass a tf.Operation to tf.Session.run(), then (quoting the docs) "the corresponding fetched value will be None."
You can think of a tf.Operation like a function with a void return type (in a language like C or Java). It's something that you run() to cause a side effect (i.e., updating the variables), but it doesn't have a meaningful return value itself.
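In practice, the usual pattern is to run the training op purely for its side effect and keep only the values of the tensors you care about, discarding the op's fetch result. A sketch of your training loop rewritten that way (same graph as above, nothing new is computed):

    for step in range(10):
        _, loss = sess.run([train_op, cross_entropy],
                           feed_dict={x: np.array([[1.0, 2.0, 3.0]]),
                                      labels: e_labels})
        print("step", step, "loss:", loss)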