
How to debug "INVALID_ARGUMENT: input tensor alias not found in signature"


I am trying to access my TensorFlow Serving model, which has a signature as shown in this code segment:

# (imports assumed from the rest of the script, e.g. tag_constants and
#  predict_signature_def from tf.saved_model, plus an existing
#  SavedModelBuilder `builder` and session `sess`)
regression_signature = predict_signature_def(
    inputs={"phase_features": phase_features},
    outputs={"phase_weight": phase_weight}
)

builder.add_meta_graph_and_variables(
    sess=sess,
    tags=[tag_constants.SERVING],
    signature_def_map={'regress_phase_weight': regression_signature})
builder.save()

It is loaded in TF Serving using a model config file with the following content:

model_config_list: {
  config: {
    name: "PhaseModificationWeightRegressor",
    base_path: "../../models/phase-mod-weights",
    model_platform: "tensorflow"
  }
}
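One thing worth double-checking at this point: TensorFlow Serving serves numeric version subdirectories *under* `base_path`, not `base_path` itself, and a relative `base_path` is resolved against the server process's working directory, so an absolute path is usually safer. A typical layout (paths illustrative):

```
models/phase-mod-weights/
└── 1/                      <- served as version 1
    ├── saved_model.pb
    └── variables/
        ├── variables.data-00000-of-00001
        └── variables.index
```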

Now I am trying to create a request from Java based on the TensorFlow API and TensorFlow Serving API protos:

...
...
tensorflow.serving.Model.ModelSpec.Builder modelSpec = ModelSpec.newBuilder()
        .setName("PhaseModificationWeightRegressor") // name as defined in the TensorFlow Serving model config file
        .setSignatureName("regress_phase_weight")    // signature name as defined in the signature_def_map passed to add_meta_graph_and_variables of your SavedModelBuilder
        .setVersion(Int64Value.newBuilder().setValue(1)); // model version as indicated by the version of your model when saving it

tensorflow.serving.Predict.PredictRequest.Builder requestBuilder = PredictRequest.newBuilder()
        .setModelSpec(modelSpec)
        .putInputs("phase_features", createTensorProto(phaseFeatures));

return requestBuilder.build();

Unfortunately I get an exception:

Exception in thread "main" io.grpc.StatusRuntimeException: INVALID_ARGUMENT: input tensor alias not found in signature: phase_features

From what I understand, this means the request reached the server, which FOUND my model named "PhaseModificationWeightRegressor" in version 1 and FOUND the signature named "regress_phase_weight", but could not find the logical name (alias) "phase_features" that maps to the appropriate placeholder. Could anything else be the problem? I have been eyeballing this for two hours now but cannot spot the problem, typo, etc. that makes it fail.
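For intuition, the failing lookup can be sketched in plain Python (illustrative names, not TF Serving's actual code): the request's `inputs` map is keyed by alias, and every key must exist in the signature's `inputs` map.

```python
def check_request_inputs(signature_inputs, request_inputs):
    """Sketch of the server-side alias check (not TF Serving's real code)."""
    for alias in request_inputs:
        if alias not in signature_inputs:
            raise ValueError(
                "INVALID_ARGUMENT: input tensor alias not found in signature: "
                + alias)

# The signature maps alias -> underlying tensor name.
signature = {"phase_features": "phase_features:0"}

check_request_inputs(signature, {"phase_features": [0.1, 0.2]})  # passes

try:
    check_request_inputs(signature, {"features": [0.1, 0.2]})
except ValueError as e:
    print(e)  # alias "features" is not in the signature
```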

Any idea what could be wrong? How can I debug this better? Maybe in the future, TF Serving should report what it expects instead.


Solution

  • I resolved it with:

    .putInputs("inputs", createTensorProto(phaseFeatures));
    

    It seems that the alias to be used from the client is exactly "inputs", while the Python code remains the same. With this modification my code works.

    UPDATE: I also tested this with a model that has multiple (two) inputs. In that case, you should use the aliases you defined while exporting the model.
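    To see exactly which input aliases a served signature expects, you can also dump the export with the `saved_model_cli` tool that ships with TensorFlow (the directory below is illustrative; point it at one concrete version directory):

    ```shell
    # List all tag-sets, signatures, and their input/output aliases and dtypes
    saved_model_cli show --dir ../../models/phase-mod-weights/1 --all

    # Or inspect just the one signature
    saved_model_cli show --dir ../../models/phase-mod-weights/1 \
        --tag_set serve --signature_def regress_phase_weight
    ```

    The `inputs` entries printed for the signature are exactly the aliases that `putInputs` must use.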