I am trying to implement "streaming contents" on my PythonAnywhere account.
It works more or less like the example shown here: http://flask.pocoo.org/docs/0.10/patterns/streaming/
except that my view runs a long computation for maybe one minute and yields its data to my template, where a script is supposed to update some progress bars (source.onmessage).
This works perfectly on my development machine, but not on my PythonAnywhere account. On that server, the process looks jammed (the progress bars are never updated, except at the very end, where they suddenly jump from 0% to 100%), although everything goes well under the hood, e.g. my print statements are correctly written to my server logs.
In the documentation linked above, there is a note:
Note though that some WSGI middlewares might break streaming, so be careful there in debug environments with profilers and other things you might have enabled.
Could that be the problem here, and if so, is there a workaround?
JS code from my jinja2 template:
<script type="text/javascript">
  /* progress bar */
  var source = new EventSource("{{ url_for('BP.run', mylongprocess_id=mylongprocess_id) }}");
  source.onmessage = function(event) {
    console.log(event.data);
    var data = event.data.split("!!");
    var nodeid = data[0];
    var process = data[1];
    var process_status = data[2];
    var postpro = data[3];
    var postpro_status = data[4];
    $('.pb1').css('width', process+'%').attr('aria-valuenow', process);
    $('.pb2').css('width', postpro+'%').attr('aria-valuenow', postpro);
    document.getElementById("process_status").innerHTML = process_status;
    document.getElementById("postpro_status").innerHTML = postpro_status;
    document.getElementById("nodeid").innerHTML = nodeid;
    if (postpro >= 100) {
      setTimeout(function() {
        console.log("progress is finished!");
        document.getElementById("status").innerHTML = "redirecting to {{ url_for('.view_sonix_result', mylongprocess_id=mylongprocess_id) }}";
        window.location.replace("{{ url_for('.terminate_analysis', mylongprocess_id=mylongprocess_id) }}");
      }, 2); // setTimeout
    } else {
      document.getElementById("status").innerHTML = "pending...";
    }
  }; // onmessage
</script>
My (simplified) view:
@BP.route('/run/<int:mylongprocess_id>')
@login_required
def run(mylongprocess_id):
    mylongprocess = MyLongProcess.query.get_or_404(mylongprocess_id)
    project = Project.query.get_or_404(mylongprocess.project_id)
    check_rights(current_user, project, 'user', 404)
    A, lcs = _create_analysis(mylongprocess)

    @copy_current_request_context
    def gen(mylongprocess, nodeid, store_path):
        print('now running %s' % A)
        for (loopnb, total_loops, pct, lclabel) in A.runiterator(lcs):
            print('ran %d/%d (%.1f%%) "%s"' % (loopnb, total_loops,
                                               pct, lclabel))
            progress = ('data: %s!!%f!!%s!!%f!!%s\n\n' %
                        (nodeid, pct, lclabel, 0, 'waiting...'))
            yield progress
        print('now postprocessing %s' % A)
        postpro = load_node(store_path, node_id=nodeid)
        for step, total, pct, action in postpro._builditer(target='web',
                                                           buildfile=None):
            progress = ('data: %s!!%f!!%s!!%f!!%s\n\n' %
                        (nodeid, 100, 'ok', pct, action.replace('_', ' ')))
            yield progress
        print('now terminating %s' % A)
        _terminate_analysis(A, mylongprocess)
    return Response(gen(mylongprocess, mylongprocess.nodeid,
                        mylongprocess.store_path),
                    mimetype='text/event-stream')
Your traffic goes through an nginx proxy when it is hosted on PythonAnywhere, and nginx buffers the response unless told otherwise. To get everything to flush, set
response.headers['X-Accel-Buffering'] = 'no'
on your streaming response, and make sure there is a '\n' at the end of each string you are yielding, because Python also buffers until end of line.
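For reference, here is a minimal sketch of how the two fixes fit together in a Flask route; the /stream route, the generator, and the messages are only illustrative, not taken from the question:

from flask import Flask, Response
import time

app = Flask(__name__)

@app.route('/stream')
def stream():
    def gen():
        for pct in range(0, 101, 10):
            # Every SSE message ends with a blank line ('\n\n'), which also
            # satisfies "end the yielded string with a newline".
            yield 'data: %d\n\n' % pct
            time.sleep(1)
    response = Response(gen(), mimetype='text/event-stream')
    # Tell the nginx proxy not to buffer this response, so each chunk
    # reaches the browser as soon as it is yielded.
    response.headers['X-Accel-Buffering'] = 'no'
    return response

Your generator already ends each message with '\n\n', so adding the X-Accel-Buffering header to the Response returned by run() may well be enough.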