I am maintaining an API server for my company that runs a Python Flask app under uWSGI behind nginx.
...
@app.route('/getquick', methods=["GET"])
def GET_GET_IP_DATA():
    sp_final = "CALL sp_quick()"
    cursor.execute(sp_final)

@app.route('/get_massive_log', methods=["POST"])
def get_massive_log():
    sp_final = "CALL sp_slow()"
    cursor.execute(sp_final)
...
While the first request, /getquick, is processed very quickly, /get_massive_log can take up to five seconds because of a rather long and complex MySQL query. The server can handle a few of these queries at once, but it starts throwing broken pipe errors when it is hit too often.
The problem is that the other /getquick requests get blocked behind these long I/O-bound requests.
My manager suggested using gevent to somehow free up the server to process other requests while it waits on the MySQL queries, but I am not sure whether I am looking in the right direction.
I am using pymysql to run the queries, and Google suggests it should work with gevent on top of uWSGI, but I have not been able to get better results with it.
I have been googling for days, and while I am trying to understand threads, concurrency, and asynchronous requests, I don't know where to start digging for a solution. Is it even possible? Any suggestions, or even pointers on where to research, would be greatly appreciated.
EDIT: Perhaps my question wasn't clear, so I'll try to restate it:
What is the best way to free up uWSGI workers to process other requests while they wait on long database queries?
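For context, this is roughly the setup I have been experimenting with; the connection details, module name, and uWSGI command line below are placeholders, not my real configuration:

from gevent import monkey
monkey.patch_all()  # patch the stdlib before pymysql/Flask open any sockets

import pymysql
from flask import Flask, jsonify

app = Flask(__name__)

def run_query(sql):
    # one connection per request; pymysql is pure Python, so its socket I/O
    # cooperates with gevent once the standard library is monkey-patched
    conn = pymysql.connect(host="DB_HOST", user="DB_USER",
                           password="DB_PASS", db="DB_NAME",
                           cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cursor:
            cursor.execute(sql)
            return cursor.fetchall()
    finally:
        conn.close()

@app.route('/getquick', methods=["GET"])
def get_quick():
    return jsonify(rows=run_query("CALL sp_quick()"))

@app.route('/get_massive_log', methods=["POST"])
def get_massive_log():
    # while MySQL runs the slow procedure this greenlet is parked, so the
    # same worker should be able to keep answering /getquick
    return jsonify(rows=run_query("CALL sp_slow()"))

I start it with something like uwsgi --http :8080 --module app:app --master --gevent 100 (uWSGI built with the gevent plugin). Even so, the /getquick requests still seem to wait behind the slow queries.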
You need to look into uWSGI offloading.
Offloading is a way to optimize tiny tasks, delegating them to one or more threads.
These threads run such tasks in a non-blocking/evented way allowing for a huge amount of concurrency.
You can read about the offloading subsystem in the docs.
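A minimal sketch of turning it on, assuming an ini-style uWSGI config (the module name and counts here are only examples, not taken from your setup):

[uwsgi]
module = app:app
master = true
processes = 4
; give each worker two offload threads; uWSGI hands offloadable work
; (serving static files, buffering large responses to slow clients,
; proxying) to these threads so the worker itself is released immediately
offload-threads = 2

The offload threads take over the transfer of data to the client, so a worker that has finished building a big response (like your /get_massive_log output) is not tied up pushing bytes to a slow client.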