
How to serve a Prophet model for my Django application?


I created a model with Facebook Prophet. I'm now wondering what the "best" way is to access these predictions from an online web application (Django).

Requirements are that I have to retrain/update my model on a weekly basis with data from my Django application (PostgreSQL). The predictions will be saved, and I want to be able to access this data from my Django application.

After looking into Google Cloud and AWS, I couldn't find a solution that hosts my model in a way that lets me simply access predictions via an API.

My best idea/approach to solve that right now:

1) Build a Flask application that trains my models on a weekly basis. Predictions are saved in PostgreSQL. The training data will be a weekly CSV export from my Django web application.

2) Create an API in my Flask application that can access predictions from the database.

3) From my Django application, I can call the API and access the data whenever needed.
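Step 3 of this approach can be very thin on the Django side. As a rough sketch (the host name and the `/predictions/<id>` endpoint path are hypothetical, not from any existing service):

```python
API_BASE = "https://forecast.example.com"  # hypothetical Flask service URL


def prediction_url(item_id, base=API_BASE):
    """Build the URL for the (illustrative) predictions endpoint."""
    return "{}/predictions/{}".format(base, item_id)


def fetch_predictions(item_id):
    """Call the Flask API from a Django view and return its JSON payload."""
    # Imported here so the URL helper above has no third-party dependency.
    import requests

    resp = requests.get(prediction_url(item_id), timeout=5)
    resp.raise_for_status()
    return resp.json()
```

A Django view would call `fetch_predictions(...)` and pass the result to a template or serializer; error handling and caching are left out for brevity.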

I am pretty sure my approach is clunky and probably not how this is usually done. Do you have any feedback or ideas on how to solve it better? In short:

1) Generate predictions from data in a PostgreSQL database.

2) Serve predictions in a Django web application.


Solution

  • The simplest way to serve pre-calculated forecast values from Prophet is to serve CSV files from S3 or another file server. You can refresh your models every few days and write the forecast output to S3:

    import boto3
    from io import StringIO
    
    DESTINATION = bucket_name  # placeholder: your S3 bucket name
    
    def write_dataframe_to_csv_on_s3(dataframe, filename):
        """Write a dataframe to a CSV on S3."""
        print("Writing {} records to {}".format(len(dataframe), filename))
        # Serialize the dataframe into an in-memory buffer
        csv_buffer = StringIO()
        dataframe.to_csv(csv_buffer, sep=",", index=False)
        # Upload the buffer contents as an S3 object
        s3_resource = boto3.resource("s3")
        s3_resource.Object(DESTINATION, filename).put(Body=csv_buffer.getvalue())
    
    # Keep only the columns needed to serve forecasts
    # (forecast is the DataFrame returned by Prophet's predict())
    results = forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].copy()
    
    write_dataframe_to_csv_on_s3(results, output + file_name + ".csv")
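On the Django side, the saved CSV can be pulled back from S3 and turned into a DataFrame. A minimal sketch, assuming the bucket and file names match what was written above (the parsing helper alone has no AWS dependency):

```python
import pandas as pd
from io import StringIO


def parse_forecast_csv(csv_text):
    """Parse CSV text produced by write_dataframe_to_csv_on_s3 into a DataFrame."""
    return pd.read_csv(StringIO(csv_text), parse_dates=["ds"])


def read_forecast_from_s3(bucket, filename):
    """Fetch the forecast CSV from S3 and return it as a DataFrame."""
    # Imported here so parse_forecast_csv works without AWS credentials set up.
    import boto3

    body = boto3.resource("s3").Object(bucket, filename).get()["Body"]
    return parse_forecast_csv(body.read().decode("utf-8"))
```

In a Django view you could then filter the rows you need, e.g. `forecast[forecast["ds"] >= date.today()]`, and hand them to a template or serializer instead of running Prophet at request time.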