I'm rewriting my code to use tf.estimator.Estimator as an encapsulating object for my models. The problem is: I don't see how a typical input pipeline fits into the picture.
My input pipeline uses queues which are coordinated by tf.train.Coordinator.
To satisfy the tf.estimator.Estimator requirements I create the whole "input graph" in the input_fn function that is passed to the estimator when calling Estimator.train(...).
It looks like this:

    def input_fn():
        # ... create the input graph: readers, preprocessing, a queue ...
        qr = tf.train.QueueRunner(queue, [enqueue_op])  # enqueue op(s) built above
        tf.train.add_queue_runner(qr)
        return features, labels
The problem is: in such a scenario, how can I start and stop the queue runners at the start and end of Estimator.train(...), respectively?
Starting
I figured out that for starting the queues I can pass an init_fn that does it via the Scaffold object given to the Estimator (see the sketch below).
However, I do not know how to join the threads and close them gracefully.
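A minimal sketch of that approach, assuming a custom model_fn (the linear layer, loss, learning rate, and the feature key 'x' are placeholders): the Scaffold's init_fn receives the session, so queue runners can be started there. Note that, as the answer below points out, the MonitoredTrainingSession used by Estimator starts registered queue runners on its own, so this explicit init_fn may well be redundant.

    import tensorflow as tf

    def model_fn(features, labels, mode):
        # Toy model just to produce a loss and train_op.
        predictions = tf.layers.dense(features['x'], 1)
        loss = tf.losses.mean_squared_error(labels, predictions)
        train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
            loss, global_step=tf.train.get_global_step())

        # Scaffold.init_fn is called once with (scaffold, session) after
        # variables are initialized; the session can be used to start queues.
        scaffold = tf.train.Scaffold(
            init_fn=lambda scaffold, sess: tf.train.start_queue_runners(sess=sess))

        return tf.estimator.EstimatorSpec(
            mode=mode, loss=loss, train_op=train_op, scaffold=scaffold)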
Is there a reference architecture for a properly threaded input pipeline when using tf.estimator? Is the Estimator class even ready to work with queues?
Estimator uses tf.train.MonitoredTrainingSession, which handles starting and joining threads. You can check a couple of example input_fns, such as tf.estimator.inputs.* and tf.contrib.learn.io.read*.
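In other words, as long as input_fn registers its queue runners in the QUEUE_RUNNERS collection (tf.train.string_input_producer and tf.train.shuffle_batch do this for you), Estimator.train starts and joins the threads itself and no Coordinator has to be managed manually. A minimal sketch under those assumptions (the TFRecord file name and feature schema are placeholders):

    import tensorflow as tf

    def input_fn():
        # Both ops below register queue runners in the QUEUE_RUNNERS collection.
        filename_queue = tf.train.string_input_producer(['train.tfrecords'])
        reader = tf.TFRecordReader()
        _, serialized = reader.read(filename_queue)
        parsed = tf.parse_single_example(
            serialized,
            features={'x': tf.FixedLenFeature([10], tf.float32),
                      'y': tf.FixedLenFeature([1], tf.float32)})
        x, y = tf.train.shuffle_batch(
            [parsed['x'], parsed['y']], batch_size=32,
            capacity=1000, min_after_dequeue=100)
        return {'x': x}, y

    # Estimator.train runs the graph in a MonitoredTrainingSession, which
    # starts the registered queue runners and joins them when training ends.
    estimator = tf.estimator.Estimator(model_fn=model_fn)  # model_fn as above
    estimator.train(input_fn=input_fn, steps=1000)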