I'm building a Flask test predictor using AllenNLP.
I'm passing 'passage' and 'question' from a .json file to the predictor.
However, when I POST the JSON file with curl, I don't get a response back. Is there something specific I need to return from Flask to get one?
Code looks like:
from allennlp.predictors.predictor import Predictor as AllenNLPPredictor
from flask import Flask
from flask import request

app = Flask(__name__)

@app.route("/", methods=['GET', 'POST'])
def hello():
    return "<h1>Test app!</h1>"

class PythonPredictor:
    def __init__(self, config):
        self.predictor = AllenNLPPredictor.from_path(
            "https://storage.googleapis.com/allennlp-public-models/bidaf-elmo-model-2018.11.30-charpad.tar.gz"
        )

    def predict(self, payload):
        if request.method == "POST":
            prediction = self.predictor.predict(
                passage=payload["passage"], question=payload["question"]
            )
            return prediction["best_span_str"]
The curl command looks like:
curl http://127.0.0.1:5000 -X POST -H "Content-Type: application/json" -d @sample.json
Unless I've misunderstood (I'm guessing you're asking how to obtain the JSON submission in your route and return the result), it sounds like you need to do something like this, replacing your existing hello route:
p = PythonPredictor(config=None)  # __init__ takes a config argument but doesn't use it, so None is fine

@app.route("/", methods=['POST'])
def hello():
    data = request.get_json()
    result = p.predict(data)
    return result
This effectively runs the data in your sample.json through your PythonPredictor.predict method, and returns that prediction to the client.
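For reference, this assumes sample.json contains the passage and question keys your predict method expects, something along these lines (the contents are purely illustrative):

{
    "passage": "Flask is a lightweight WSGI web application framework written in Python.",
    "question": "What language is Flask written in?"
}

With that in place, your existing curl command should return the best_span_str string as the response body.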
Notice this code creates the instance p outside the route function, so that the NLP model is only loaded when your Flask app starts (not on every request). However, it looks like this may re-download that file, unless AllenNLPPredictor.from_path does some caching, so it would probably be advisable to manually download that file to your own storage first and load from there in the PythonPredictor.__init__ function.
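As a rough sketch of that last point, assuming you have already downloaded the archive to a local location such as models/bidaf-elmo.tar.gz (the path is just a placeholder), from_path also accepts a local file path:

class PythonPredictor:
    def __init__(self, config=None):
        # Load the model archive from local disk rather than fetching the URL
        # each time the app starts; the path below is only an example.
        self.predictor = AllenNLPPredictor.from_path("models/bidaf-elmo.tar.gz")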
Let me know if any of this needs clarification, or if I've misunderstood.