Tags: python, ml.net, onnx

Error while using an ONNX model for inference in ML.NET 1.4


I am creating my first application in ML.NET. I want to use a model built with sklearn to predict a car's price given its manufacturing year.

For this I am using a simple data set that looks like the sample below.

> id          region  price  year  model                     fuel  odometer  transmission
> 7316814884  auburn  33590  2014  sierra 1500 crew cab slt  gas   57923     other
> 7316814758  auburn  22590  2010  silverado 1500            gas   71229     other
> 7316814989  auburn  39590  2020  silverado 1500 crew       gas   19160     other

My Python code to train the model and export it to ONNX:

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

car_data = pd.read_csv('vehicles.csv', header=0, index_col=None)

# Train a simple linear regression: price as a function of year
Input_Cols = car_data[['year']]
Predict_Col = car_data['price']

X_train, X_test, y_train, y_test = train_test_split(Input_Cols, Predict_Col, test_size=0.2)

lrm = LinearRegression()
lrm.fit(X_train, y_train)

# Export the trained model to ONNX
initial_type = [('year', FloatTensorType([1]))]
onx = convert_sklearn(lrm, initial_types=initial_type)
with open("CarPricePrediction.onnx", "wb") as f:
    f.write(onx.SerializeToString())
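
For reference, the input and output names that the ML.NET pipeline later has to bind to are defined by this exported graph. They can be listed with a short script; this is only a sketch, assuming the onnxruntime package is installed, and for a converted LinearRegression the single output is typically named "variable".

import onnxruntime as rt

# Print the input/output names, shapes and types the exported graph declares
sess = rt.InferenceSession("CarPricePrediction.onnx")
for inp in sess.get_inputs():
    print("input: ", inp.name, inp.shape, inp.type)
for out in sess.get_outputs():
    print("output:", out.name, out.shape, out.type)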

My input and output data classes in .NET:

public class InputData
{
    [ColumnName("year")]
    public float Year { get; set; }
}
public class Output
{
    [ColumnName("sentence_embedding")]
    public List<float> SentenceEmbedding { get; set; }
}

Main Inference Code

public static class Program
{
    static string ONNX_MODEL_PATH = @".\ONNX\CarPricePrediction.onnx";
    static MLContext mlContext = new MLContext();

    static void Main(string[] args)
    {
        var onnxPredictionPipeline = GetPredictionPipeline(mlContext);

        var onnxPredictionEngine = mlContext.Model.CreatePredictionEngine<InputData, Output>(onnxPredictionPipeline);

        var input = new InputData() { Year = 2010 };
        Output prediction = new Output();
        onnxPredictionEngine.Predict(input, ref prediction);
        Console.WriteLine($"Predicted price: {prediction.SentenceEmbedding.First()}");
    }

    static ITransformer GetPredictionPipeline(MLContext mlContext)
    {
        var inputColumns = new string[] { "year" };
        var outputColumns = new string[] { "variable" };
        var onnxPredictionPipeline = mlContext.Transforms
                                        .ApplyOnnxModel(
                                            outputColumnNames: outputColumns,
                                            inputColumnNames: inputColumns,
                                            ONNX_MODEL_PATH);
        var emptyDv = mlContext.Data.LoadFromEnumerable(new InputData[] { });

        return onnxPredictionPipeline.Fit(emptyDv);
    }
}

I am getting an exception on

mlContext.Model.CreatePredictionEngine<InputData, Output>(onnxPredictionPipeline)



Solution

  • I figured out the issues in my code. The following changes made it work.

    Python changes

    Change this

    initial_type = [('year', FloatTensorType([1]))]
    

    to

    initial_type = [('year', FloatTensorType([1,1]))]
    

    C# Changes

    Change this

    public class Output
    {
        [ColumnName("sentence_embedding")]
        public List<float> SentenceEmbedding { get; set; }
    }
    

    to

    public class Output
    {
        [ColumnName("variable")]
        public float[] Value { get; set; }
    }
    

    And my code started working end to end. (A quick Python cross-check of the re-exported model is sketched below.)
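
As a cross-check outside ML.NET, the re-exported model can also be exercised directly from Python. This is only a sketch, assuming the onnxruntime package is installed and that skl2onnx kept its default output name "variable" for the regressor.

import numpy as np
import onnxruntime as rt

# Load the re-exported model
sess = rt.InferenceSession("CarPricePrediction.onnx")

# Shape (1, 1) matches FloatTensorType([1, 1]) used at export time
year = np.array([[2010.0]], dtype=np.float32)

# Ask for the "variable" output; it comes back with shape (1, 1)
pred = sess.run(["variable"], {"year": year})[0]
print("Predicted price:", float(pred[0][0]))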