Tags: ios · machine-learning · coreml · coremltools

Load a heavy CoreML model from a remote source


We have a heavy Core ML model (~170 MB) that we want to include in our iOS app.

Since we don't want the app size to be that large, we created a smaller model (with lower performance) that we can bundle directly. Our intention is to download the heavy model on app start and switch to it once the download completes.

Our initial thought was to use Apple's Core ML Model Deployment, but that quickly turned out to be impossible for us, as Apple limits MLModel archives to 50 MB.
So the question is: is there an alternative way to load a Core ML model from a remote source, similar to Apple's solution, and how would one implement it?

Any help would be appreciated. Thanks!


Solution

  • Put the .mlmodel file on a server you own and download it into the app's Documents folder using your preferred method (e.g. URLSession).
  • Create a URL to the downloaded file and compile it with MLModel.compileModel(at:).
  • Initialize MLModel (or the automatically generated model class) with the URL of the compiled model. A sketch of these steps is shown below.
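
A minimal sketch of those steps, assuming a hypothetical model URL and file name (RemoteModelLoader, HeavyModel.mlmodel, and the example.com URL are placeholders, not part of any Apple API):

    import CoreML
    import Foundation

    final class RemoteModelLoader {
        // Hypothetical remote location of the heavy model.
        private let remoteModelURL = URL(string: "https://example.com/models/HeavyModel.mlmodel")!

        func loadHeavyModel(completion: @escaping (MLModel?) -> Void) {
            // 1. Download the raw .mlmodel file.
            let task = URLSession.shared.downloadTask(with: remoteModelURL) { tempURL, _, error in
                guard let tempURL = tempURL, error == nil else {
                    completion(nil)
                    return
                }
                do {
                    // 2. Move the download into the Documents folder.
                    let documents = FileManager.default.urls(for: .documentDirectory,
                                                             in: .userDomainMask)[0]
                    let modelURL = documents.appendingPathComponent("HeavyModel.mlmodel")
                    try? FileManager.default.removeItem(at: modelURL)
                    try FileManager.default.moveItem(at: tempURL, to: modelURL)

                    // 3. Compile the model. compileModel(at:) returns a URL to a
                    //    temporary .mlmodelc directory, so move it somewhere permanent
                    //    to avoid recompiling on every launch.
                    let compiledTempURL = try MLModel.compileModel(at: modelURL)
                    let compiledURL = documents.appendingPathComponent("HeavyModel.mlmodelc")
                    try? FileManager.default.removeItem(at: compiledURL)
                    try FileManager.default.moveItem(at: compiledTempURL, to: compiledURL)

                    // 4. Initialize the MLModel from the compiled model.
                    let model = try MLModel(contentsOf: compiledURL)
                    completion(model)
                } catch {
                    completion(nil)
                }
            }
            task.resume()
        }
    }

Note that compiling a 170 MB model can take a moment, so keep it off the main thread (the URLSession completion handler above already runs on a background queue), and check for the saved .mlmodelc on launch so you only download and compile once.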