tensorflow, tensorflow-estimator

Unable to use core Estimator with contrib Predictor


I'm using canned estimators and am struggling with poor predict performance, so I'm trying to use tf.contrib.predictor to improve my inference performance. I've made this minimal example to reproduce the problem:

import tensorflow as tf
from tensorflow.contrib import predictor

def serving_input_fn():
  x = tf.placeholder(dtype=tf.string, shape=[1], name='x')
  inputs = {'x': x }
  return tf.estimator.export.ServingInputReceiver(inputs, inputs)

input_feature_column = tf.feature_column.numeric_column('x', shape=[1])
estimator = tf.estimator.DNNRegressor(
    feature_columns=[input_feature_column],
    hidden_units=[10, 20, 10],
    model_dir="model_dir\\predictor-test")

estimator_predictor = predictor.from_estimator(estimator, serving_input_fn)

estimator_predictor({"inputs": ["1.0"]})

This yields the following exception:

UnimplementedError (see above for traceback): Cast string to float is not supported
[[Node: dnn/input_from_feature_columns/input_layer/x/ToFloat = Cast[DstT=DT_FLOAT, SrcT=DT_STRING, _device="/job:localhost/replica:0/task:0/device:CPU:0"](dnn/input_from_feature_columns/input_layer/x/ExpandDims)]]

I've tried using tf.estimator.export.TensorServingInputReceiver instead of ServingInputReceiver in my serving_input_fn(), so that I can feed my model a numeric tensor, which is what I want:

def serving_input_fn():
  x = tf.placeholder(dtype=tf.float32, shape=[1], name='x')
  return tf.estimator.export.TensorServingInputReceiver(x, x)

but then I get the following exception in my predictor.from_estimator() call:

ValueError: features should be a dictionary of Tensors. Given type: <class 'tensorflow.python.framework.ops.Tensor'>

Any ideas?


Solution

  • My understanding of all of this is not rock solid, but I got it working, and given the size of the community, I'll try to share what I did.

    First, I'm running tensorflow 1.5 binaries with this patch applied manually.

    The exact code I'm running is this:

    def serving_input_fn():
        x = tf.placeholder(dtype=tf.float32, shape=[3500], name='x')
        inputs = {'x': x }
    
        return tf.estimator.export.ServingInputReceiver(inputs, inputs)
    
    estimator = tf.estimator.Estimator(
        model_fn=model_fn,
        model_dir="{}/model_dir_{}/model.ckpt-103712".format(script_dir, 3))
    
    estimator_predictor = tf.contrib.predictor.from_estimator(
                                estimator, serving_input_fn)
    
    p = estimator_predictor(
            {"x": np.array(sample.normalized.input_data)})
    

    My case is a bit different from yours because I'm using a custom Estimator, but in your case, I guess you should try something like this:

    def serving_input_fn():
      x = tf.placeholder(dtype=tf.float32, shape=[1], name='x')
      inputs = {'x': x }
    
      return tf.estimator.export.ServingInputReceiver(inputs, inputs)
    
    estimator = ...
    
    estimator_predictor = tf.contrib.predictor.from_estimator(
                                estimator, serving_input_fn)
    
    estimator_predictor({"x": [1.0]})
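
    A side note on the original UnimplementedError: it came from feeding Python strings ("1.0") into a graph whose numeric_column expects floats, which forces an in-graph string-to-float Cast that TensorFlow does not support. If your raw inputs arrive as strings, one option (a minimal sketch, independent of the Estimator code above) is to convert them on the host side with NumPy before calling the predictor:

    ```python
    import numpy as np

    # Raw inputs arriving as strings, as in the failing call in the question
    raw = ["1.0"]

    # Convert to float32 on the Python side so the graph never has to
    # cast tf.string -> tf.float32
    feed = np.asarray(raw, dtype=np.float32)

    print(feed.dtype)  # float32
    ```

    The resulting array can then be passed as {"x": feed}, matching the float32 placeholder declared in serving_input_fn().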