
ONNX model conversion with multiple input types


I know that there are not many experts out there who can help with this issue, but we are having some trouble trying to convert an XGBoost ML model to an ONNX model.

When converting with a single input type everything seems to go fine (see the sketch below), but when using multiple types I get an error that only a single input type is expected.
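
For reference, a minimal sketch of the single-input case that converts without problems (the data, xgb_reg, and the feature count are illustrative placeholders, not the real model):

import numpy as np
import xgboost as xgb
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType

# Toy regressor with two float features, standing in for the real model.
X = np.random.rand(100, 2).astype(np.float32)
y = np.random.rand(100)
xgb_reg = xgb.XGBRegressor(n_estimators=10).fit(X, y)

# A single input node covering all features converts fine.
onnx_model = onnxmltools.convert_xgboost(
    xgb_reg,
    initial_types=[('input', FloatTensorType([1, 2]))])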

Do you have an example (a Python statement) where an XGBoost model (or another model) is converted with onnxmltools using multiple TensorTypes?

For example:

import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType, Int64TensorType

onnxmltools.convert_xgboost(xgb_reg, initial_types=[
    ('input', FloatTensorType([1, 2])),
    ('another_input', Int64TensorType([1, 1]))
])

The above statement produces the error shown below. Does anyone have an example of how to handle multiple input types?

RuntimeError                              Traceback (most recent call last)
<ipython-input-196-4ad3856a5ad3> in <module>()
      8 xgb_reg.predict(X_heter)
      9 
---> 10 onnxmltools.convert_xgboost(xgb_reg, initial_types=[('input', FloatTensorType([1, 2])),('another_input', Int64TensorType([1, 1]))])
/opt/anaconda3/lib/python3.6/site-packages/onnxmltools/convert/main.py in convert_xgboost(*args, **kwargs)
     83     if not utils.keras2onnx_installed():
     84         raise RuntimeError('keras2onnx is not installed. Please install it to use this feature.')
---> 85 
     86     from keras2onnx import convert_tensorflow as convert
     87     return convert(frozen_graph_def, name, input_names, output_names, doc_string,
/opt/anaconda3/lib/python3.6/site-packages/onnxmltools/convert/xgboost/convert.py in convert(model, name, initial_types, doc_string, target_opset, targeted_onnx, custom_conversion_functions, custom_shape_calculators)
     44     return onnx_model
/opt/anaconda3/lib/python3.6/site-packages/onnxconverter_common/topology.py in compile(self)
    676         self._resolve_duplicates()
    677         self._fix_shapes()
--> 678         self._infer_all_types()
    679         self._check_structure()
    680 
/opt/anaconda3/lib/python3.6/site-packages/onnxconverter_common/topology.py in _infer_all_types(self)
    551                 pass  # in Keras converter, the shape calculator can be optional.
    552             else:
--> 553                 operator.infer_types()
    554 
    555     def _resolve_duplicates(self):
/opt/anaconda3/lib/python3.6/site-packages/onnxconverter_common/topology.py in infer_types(self)
    105     def infer_types(self):
    106         # Invoke a core inference function
--> 107         get_shape_calculator(self.type)(self)
    108 
    109 
/opt/anaconda3/lib/python3.6/site-packages/onnxconverter_common/shape_calculator.py in calculate_linear_regressor_output_shapes(operator)
     68     shape may be [N, 1].
     69     '''
---> 70     check_input_and_output_numbers(operator, input_count_range=1, output_count_range=1)
     71 
     72     N = operator.inputs[0].type.shape[0]
/opt/anaconda3/lib/python3.6/site-packages/onnxconverter_common/utils.py in check_input_and_output_numbers(operator, input_count_range, output_count_range)
    283         raise RuntimeError(
    284             'For operator %s (type: %s), at most %s input(s) is(are) supported but we got %s output(s) which are %s'
--> 285             % (operator.full_name, operator.type, max_input_count, len(operator.inputs), operator.input_full_names))
    286 
    287     if min_output_count is not None and len(operator.outputs) < min_output_count:
RuntimeError: For operator XGBRegressor (type: XGBRegressor), at most 1 input(s) is(are) supported but we got 2 output(s) which are ['input', 'another_input']

Solution

  • Yes, this is a known limitation. If all the inputs are numerical it does not matter whether they are float or int: one workaround is to treat everything as float and declare a single input node, i.e. an m×n matrix input.
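
A rough sketch of that workaround, assuming a setup like the one in the question (all names and data here are illustrative): cast the integer column to float, stack everything into one m×n float matrix, train and convert against that matrix, and declare a single FloatTensorType input.

import numpy as np
import xgboost as xgb
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType

# Two float features plus one integer feature, all cast to float and
# concatenated into a single m x n matrix.
X_float = np.random.rand(100, 2).astype(np.float32)
X_int = np.random.randint(0, 5, size=(100, 1))
X_all = np.hstack([X_float, X_int.astype(np.float32)])
y = np.random.rand(100)

xgb_reg = xgb.XGBRegressor(n_estimators=10).fit(X_all, y)

# One input node for the whole matrix; no Int64TensorType needed.
onnx_model = onnxmltools.convert_xgboost(
    xgb_reg,
    initial_types=[('input', FloatTensorType([None, X_all.shape[1]]))])
onnxmltools.utils.save_model(onnx_model, 'xgb_reg.onnx')

At inference time the integer column just has to be cast to float before being fed to the runtime, which is usually an acceptable trade-off since XGBoost treats all features numerically anyway.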