I've got a few files that use different execution modes: my translate.py file appears to enable eager execution, while detect.py uses darkflow, which relies on graph mode. During darkflow's TFNet initialization I get this error:
Traceback (most recent call last):
  File "/home/justin/Projects/comp3931/main.py", line 6, in <module>
    watcher = Watcher('res/vid/planet_earth_s01e01/video.mp4', 'res/vid/planet_earth_s01e01/english.srt')
  File "/home/justin/Projects/comp3931/watch.py", line 9, in __init__
    self.detector = Detector()
  File "/home/justin/Projects/comp3931/detect.py", line 6, in __init__
    self.tfnet = TFNet(self.options)
  File "/usr/local/lib64/python3.6/site-packages/darkflow/net/build.py", line 75, in __init__
    self.build_forward()
  File "/usr/local/lib64/python3.6/site-packages/darkflow/net/build.py", line 105, in build_forward
    self.inp = tf.placeholder(tf.float32, inp_size, 'input')
  File "/usr/local/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 1677, in placeholder
    raise RuntimeError("tf.placeholder() is not compatible with "
RuntimeError: tf.placeholder() is not compatible with eager execution.
So I assume that when I instantiate the Translator class from the translate.py file, it enables eager execution for the whole program, which is then not compatible with the calls to darkflow's TFNet class used in the Detector class from detect.py. If I run translate.py on its own it works fine, and the other modules also work fine if I run them without translate.py involved.
I guess that because they use different contexts (graph/eager), the whole thing can't run together in the same program. I've tried looking at the documentation, but I could not find a way to switch back to graph mode when needed.
Is there any way I can run both eager and graph modes in the same application in different places?
It is best to write code that's compatible with both graph mode and eager execution. From the documentation:
- Use tf.data for input processing instead of queues. It's faster and easier.
- Use object-oriented layer APIs—like tf.keras.layers and tf.keras.Model—since they have explicit storage for variables.
- Most model code works the same during eager and graph execution, but there are exceptions. (For example, dynamic models using Python control flow to change the computation based on inputs.)
- Once eager execution is enabled with tf.enable_eager_execution, it cannot be turned off. Start a new Python session to return to graph execution.
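As a rough illustration of that style, here is a minimal sketch (the model, layer sizes, and random data are made up for this answer, not taken from the question) of code that runs unchanged whether or not eager execution is enabled:

import numpy as np
import tensorflow as tf

# Uncomment the next line to run the same code under eager execution;
# leave it commented out to run in graph mode instead.
# tf.enable_eager_execution()

# A tiny model built only from tf.keras layers, so variables have
# explicit storage and the same code works in either mode.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# tf.data pipelines also work in both modes (toy data, just for illustration).
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(8).repeat()

model.fit(dataset, epochs=1, steps_per_epoch=4)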
That said, it is possible to use eager execution while in graph mode by using tfe.py_func(). Here is the code example from the documentation (I just added the imports and asserts):
import tensorflow as tf
import tensorflow.contrib.eager as tfe

def my_py_func(x):
    assert tf.executing_eagerly()
    x = tf.matmul(x, x)  # You can use tf ops
    print(x)             # but it's eager!
    return x

assert not tf.executing_eagerly()
with tf.Session() as sess:
    x = tf.placeholder(dtype=tf.float32)
    # Call eager function in graph!
    pf = tfe.py_func(my_py_func, [x], tf.float32)
    sess.run(pf, feed_dict={x: [[2.0]]})  # [[4.0]]
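The value returned by tfe.py_func() is an ordinary graph tensor, so (as a small variation of my own, not from the documentation) it can be combined with regular graph ops before the session runs:

with tf.Session() as sess:
    x = tf.placeholder(dtype=tf.float32)
    pf = tfe.py_func(my_py_func, [x], tf.float32)
    y = pf + 1.0  # a regular graph op applied to the eager function's result
    print(sess.run(y, feed_dict={x: [[2.0]]}))  # [[5.0]]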
The reverse is also possible, as Alex Passos explains in this video. Here is an example inspired by the one in the video:
import tensorflow as tf
import tensorflow.contrib.eager as tfe

tf.enable_eager_execution()

def my_graph_func(x):
    assert not tf.executing_eagerly()
    w = tfe.Variable(2.0)
    b = tfe.Variable(4.0)
    return x * w + b

assert tf.executing_eagerly()
g = tfe.make_template("g", my_graph_func, create_graph_function_=True)
print(g(3))
There's also an unofficial way to switch modes, using the eager_mode and graph_mode contexts defined in tensorflow.python.eager.context, like this:
import tensorflow as tf
import tensorflow.contrib.eager as tfe
from tensorflow.python.eager.context import eager_mode, graph_mode

with eager_mode():
    print("Eager mode")
    assert tf.executing_eagerly()
    x1 = tfe.Variable(5.0)
    print(x1.numpy())

print()

with graph_mode():
    print("Graph mode")
    assert not tf.executing_eagerly()
    x2 = tfe.Variable(5.0)
    with tf.Session():
        x2.initializer.run()
        print(x2.eval())
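Applied to the situation in the question, the same graph_mode() context could in principle keep darkflow's graph-building code away from the eager parts of the program. This is an untested sketch (the options dict below is just darkflow's usual example configuration, not taken from the question), and any later calls that use the network would likely need to run inside graph_mode() as well:

from tensorflow.python.eager.context import graph_mode
from darkflow.net.build import TFNet

with graph_mode():
    # TFNet builds its graph with tf.placeholder(), which only works
    # while eager execution is off.
    options = {"model": "cfg/yolo.cfg", "load": "bin/yolo.weights", "threshold": 0.5}
    tfnet = TFNet(options)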
As it is not official, you should probably avoid it in production code, but it may come in handy when debugging, or in a Jupyter notebook. One last option is to use this switch_to() function:
import tensorflow as tf
import tensorflow.contrib.eager as tfe
from tensorflow.python.eager.context import context, EAGER_MODE, GRAPH_MODE

def switch_to(mode):
    ctx = context()._eager_context
    ctx.mode = mode
    ctx.is_eager = mode == EAGER_MODE

switch_to(EAGER_MODE)
assert tf.executing_eagerly()
v = tfe.Variable(3.0)
print(v.numpy())
assert tf.get_default_graph().get_operations() == []

switch_to(GRAPH_MODE)
assert not tf.executing_eagerly()
v = tfe.Variable(3.0)
init = tf.global_variables_initializer()
assert len(tf.get_default_graph().get_operations()) > 0
with tf.Session():
    init.run()
    print(v.eval())
It is really a hack, but it may be useful in a Jupyter notebook, if you don't like nesting all your code in with blocks.