I'm currently working on a project involving a small robotic car using the Raspberry Pi. In order to use the picamera efficiently (both for image processing and a live webstream), I would like to use the YUV420 format (which is supported by picamera). This lets me use the Y values directly for image processing and leave any further conversion to the client.
Is it possible to quickly stream this array (with a Python generator) through a Flask Response object (as shown here with JPEG: http://blog.miguelgrinberg.com/post/video-streaming-with-flask)? If so, I know I can convert the YUV data to RGB in JavaScript and then draw it to a canvas, but how do I access the YUV stream one frame at a time?
If there are any other, more efficient solutions (while sticking to Flask), I'd like to hear about them too. The Flask Response with Python generator just worked like a charm for a JPEG webstream.
Looking at the picamera documentation and the blog post you've linked, you could create a multipart response consisting of a stream of YUV420 frames captured from the camera with code like this:
import io
import time

import picamera

def gen():
    with picamera.PiCamera() as camera:
        camera.start_preview()
        # Camera warm-up time
        time.sleep(2)
        while True:
            stream = io.BytesIO()
            camera.capture(stream, 'yuv')
            yield stream.getvalue()
            # perhaps add a time.sleep() here to enforce a constant framerate?
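To serve those frames the way the JPEG example does, each yielded value needs to be wrapped in multipart part headers before it goes into the Flask Response. A minimal sketch of that wrapping (the boundary name frame and the application/octet-stream content type are my own choices here, not anything picamera or Flask mandates):

```python
def multipart_chunk(frame_bytes, boundary=b'frame'):
    # Wrap one raw YUV420 frame as a single part of a
    # multipart/x-mixed-replace response body.
    return (b'--' + boundary + b'\r\n'
            b'Content-Type: application/octet-stream\r\n'
            b'Content-Length: ' + str(len(frame_bytes)).encode() + b'\r\n'
            b'\r\n' + frame_bytes + b'\r\n')

# The Flask route would then look roughly like:
# return Response((multipart_chunk(f) for f in gen()),
#                 mimetype='multipart/x-mixed-replace; boundary=frame')
```

Unlike the JPEG case, there's no image header in each part, so the client has to know the frame dimensions out of band.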
However, that doesn't address the client-side. Because you want to use the image data in JavaScript, rather than just displaying it in an <img>
tag, you'd need to fetch it from JavaScript with XMLHttpRequest, and XMLHttpRequest doesn't support multipart responses (specifically, Firefox used to, but no longer does).
The better approach would be to use WebSocket, since then it would be easy to open a WebSocket connection from JavaScript, read each frame in turn (each frame being sent in its own WebSocket message), and perform the necessary image processing. But what about the server-side? Flask-Sockets
looks like it will do the trick, and then sending the stream of frames would look something like this:
# assuming the usual Flask-Sockets setup: sockets = Sockets(app)
@sockets.route('/stream')
def stream_socket(ws):
    with picamera.PiCamera() as camera:
        camera.start_preview()
        # Camera warm-up time
        time.sleep(2)
        while not ws.closed:
            stream = io.BytesIO()
            camera.capture(stream, 'yuv')
            ws.send(stream.getvalue())
            # perhaps add a time.sleep() here to enforce a constant framerate?
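Whichever transport you pick, the client (or any server-side processing) needs to know the YUV420 buffer layout to slice out the Y values. Per the picamera documentation, unencoded captures are padded so the width is rounded up to a multiple of 32 and the height to a multiple of 16; a small helper (the function name is mine) to compute the plane sizes:

```python
def yuv420_plane_sizes(width, height):
    # picamera pads the capture resolution: width up to a multiple of 32,
    # height up to a multiple of 16.
    fw = (width + 31) // 32 * 32
    fh = (height + 15) // 16 * 16
    y_size = fw * fh        # one byte per pixel (luma plane)
    uv_size = y_size // 4   # each chroma plane is a quarter of the luma plane
    return y_size, uv_size  # total frame length = y_size + 2 * uv_size

# e.g. slicing just the Y plane out of one received frame:
# y_size, _ = yuv420_plane_sizes(640, 480)
# y_plane = frame[:y_size]
```

The same arithmetic ports directly to the JavaScript side for the canvas conversion.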