apache-spark, machine-learning, databricks, mlflow

Serving multiple ML models using mlflow in a single VM


I have set up an MLflow service on a VM and I am able to serve a model using the mlflow models serve command. I want to know whether it is possible to host multiple models on a single VM.

I am using the command below to serve a model with MLflow on the VM.

command:

mlflow models serve -m models:/$Model-Name/$Version --no-conda -p 443 -h 0.0.0.0

The above command starts a model server on port 443. Is it possible to create an endpoint with the model name in the path, like the one below?

Current URL: https://localhost:443/invocations

Expected URL: https://localhost:443/model-name/invocations
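
For reference, calling the current endpoint looks roughly like the sketch below. The feature names are made up for illustration, and the exact JSON layout depends on the MLflow version (MLflow 2.x wraps the pandas "split" payload in a dataframe_split key):

    import requests

    # Hypothetical two-feature payload in pandas "split" orientation.
    payload = {
        "dataframe_split": {
            "columns": ["feature_a", "feature_b"],
            "data": [[1.0, 2.0]],
        }
    }
    resp = requests.post(
        "https://localhost:443/invocations",
        json=payload,
        verify=False,  # assuming a self-signed certificate on the VM
    )
    print(resp.status_code, resp.text)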


Solution

  • I believe mlflow models serve will only accept POST requests at the /invocations path.
    If you want something custom, I would suggest:

    1. Seldon
    2. Create a simple Flask app to do it, as illustrated in this blog post (a minimal sketch of this approach follows the list).
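
    To make the second option concrete, here is a minimal sketch of such a Flask proxy, assuming each model is served by its own mlflow models serve process on an internal port of the same VM. The model names (churn, fraud) and port numbers are illustrative assumptions, not part of the original setup:

        import requests
        from flask import Flask, Response, abort, request

        app = Flask(__name__)

        # Each model runs as its own mlflow server on a local port, e.g.:
        #   mlflow models serve -m models:/churn/1 --no-conda -p 5001 -h 127.0.0.1
        #   mlflow models serve -m models:/fraud/2 --no-conda -p 5002 -h 127.0.0.1
        MODEL_PORTS = {
            "churn": 5001,  # hypothetical model name -> internal port
            "fraud": 5002,
        }

        @app.route("/<model_name>/invocations", methods=["POST"])
        def invoke(model_name):
            port = MODEL_PORTS.get(model_name)
            if port is None:
                abort(404, f"unknown model: {model_name}")
            # Forward the raw request body to the matching model server.
            upstream = requests.post(
                f"http://127.0.0.1:{port}/invocations",
                data=request.get_data(),
                headers={"Content-Type": request.content_type or "application/json"},
            )
            return Response(
                upstream.content,
                status=upstream.status_code,
                content_type=upstream.headers.get("Content-Type"),
            )

        if __name__ == "__main__":
            # The proxy listens on the public port; in practice you would run
            # it behind gunicorn/nginx with TLS rather than the Flask dev server.
            app.run(host="0.0.0.0", port=443)

    With this in place, each model keeps its standard /invocations contract, and the proxy only adds the model-name prefix that the question asks for.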