I need to stream frames from a native Android Vulkan application of mine. I'm successfully copying data off my framebuffer, and the raw frame data is ready for encoding. However, I'm torn on what the best next steps would be.
The raw image needs to be encoded to JPEG, then transported via HLS to an HTTP server, which will host the playlist externally.
The main point of contention is how to encode the raw frames and then serve them to the HTTP server.
I assume I'll need to start a TCP server to send raw frames to an FFmpeg/GStreamer pipeline for encoding, but I'm not sure what that command would look like, or whether there is a better way to do this.
You could use appsrc to push your buffers into your GStreamer pipeline directly; no TCP server is needed in between.
You can see how that works here: https://gstreamer.freedesktop.org/documentation/tutorials/basic/short-cutting-the-pipeline.html?gi-language=c
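For example, here is a minimal sketch of pushing raw frames into an appsrc-fed pipeline. The 1280x720 RGBA/30 fps caps and the multifilesink stand-in sink are assumptions; match the caps to your actual framebuffer format and swap the sink for your encoding/upload branch:

```c
/* Minimal sketch: push raw frames into an appsrc-fed pipeline.
 * Assumptions: 1280x720 RGBA at 30 fps, multifilesink as a stand-in
 * sink. Build with:
 *   gcc push.c $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-app-1.0)
 */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <string.h>

#define WIDTH  1280
#define HEIGHT 720
#define FPS    30

int main(int argc, char *argv[]) {
  gst_init(&argc, &argv);

  GstElement *pipeline = gst_parse_launch(
      "appsrc name=src is-live=true format=time "
      "! videoconvert ! jpegenc ! multifilesink location=frame-%05d.jpg",
      NULL);
  GstElement *src = gst_bin_get_by_name(GST_BIN(pipeline), "src");

  /* The caps must describe your raw frame data exactly. */
  GstCaps *caps = gst_caps_new_simple("video/x-raw",
      "format",    G_TYPE_STRING,     "RGBA",
      "width",     G_TYPE_INT,        WIDTH,
      "height",    G_TYPE_INT,        HEIGHT,
      "framerate", GST_TYPE_FRACTION, FPS, 1,
      NULL);
  gst_app_src_set_caps(GST_APP_SRC(src), caps);
  gst_caps_unref(caps);

  gst_element_set_state(pipeline, GST_STATE_PLAYING);

  /* Push a few dummy frames; in your app, copy in the buffer you read
   * back from the Vulkan framebuffer instead of memset(). */
  for (int i = 0; i < 90; i++) {
    gsize size = (gsize) WIDTH * HEIGHT * 4;
    GstBuffer *buf = gst_buffer_new_allocate(NULL, size, NULL);
    GstMapInfo map;
    gst_buffer_map(buf, &map, GST_MAP_WRITE);
    memset(map.data, i * 2, size);            /* stand-in pixel data */
    gst_buffer_unmap(buf, &map);

    /* Timestamps let downstream elements pace and mux the stream. */
    GST_BUFFER_PTS(buf)      = gst_util_uint64_scale(i, GST_SECOND, FPS);
    GST_BUFFER_DURATION(buf) = gst_util_uint64_scale(1, GST_SECOND, FPS);

    /* push_buffer takes ownership of buf. */
    if (gst_app_src_push_buffer(GST_APP_SRC(src), buf) != GST_FLOW_OK)
      break;
  }

  gst_app_src_end_of_stream(GST_APP_SRC(src));

  /* Wait for EOS (or an error) so the last frames are flushed. */
  GstBus *bus = gst_element_get_bus(pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg) gst_message_unref(msg);
  gst_object_unref(bus);

  gst_element_set_state(pipeline, GST_STATE_NULL);
  gst_object_unref(src);
  gst_object_unref(pipeline);
  return 0;
}
```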
For HLS output there is the hlssink2 element: https://gstreamer.freedesktop.org/documentation/hls/hlssink2.html?gi-language=c
But are you sure the HLS protocol supports JPEG? HLS segments are normally MPEG-TS or fragmented MP4 carrying H.264/H.265, so plain JPEG frames won't work over HLS directly.
If you just want to transfer JPEG-encoded frames over HTTP, curlhttpsink can upload each one: https://gstreamer.freedesktop.org/documentation/curl/curlhttpsink.html?gi-language=c
In conclusion:
your buffer -> appsrc ! jpegenc ! jpegparse ! curlhttpsink
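As a concrete sketch of that conclusion, you can reuse the appsrc program above and only swap the pipeline description; the upload URL and file name below are placeholders for your own server:

```c
/* Drop-in replacement for the gst_parse_launch() string in the appsrc
 * sketch above: encode each frame to JPEG and HTTP-upload it with
 * curlhttpsink. The location URL and file-name are placeholders. */
GstElement *pipeline = gst_parse_launch(
    "appsrc name=src is-live=true format=time "
    "! videoconvert ! jpegenc ! jpegparse "
    "! curlhttpsink location=http://example.com/frames/ file-name=frame.jpg",
    NULL);
```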
If you do want HLS, use the hlssink2 element linked above; it handles segmenting and playlist generation, but it expects H.264 video rather than JPEG, so you would put an H.264 encoder in front of it, as sketched below.
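Here is a hedged sketch of that HLS variant, again only changing the pipeline string. The output paths, x264enc settings, and segment parameters are assumptions; on Android you would likely use a hardware encoder from the androidmedia plugin instead of x264enc:

```c
/* HLS variant: re-encode the raw frames as H.264 and let hlssink2 cut
 * the MPEG-TS segments and maintain the .m3u8 playlist. The paths and
 * tuning values are assumptions; write into a directory your HTTP
 * server exposes. */
GstElement *pipeline = gst_parse_launch(
    "appsrc name=src is-live=true format=time "
    "! videoconvert ! x264enc tune=zerolatency ! h264parse "
    "! hlssink2 location=/var/www/hls/segment%05d.ts "
    "playlist-location=/var/www/hls/playlist.m3u8 "
    "target-duration=2 max-files=10",
    NULL);
```

With this, your HTTP server only needs to serve playlist.m3u8 and the segments as static files.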
And here is how to make a GStreamer Android app: https://gstreamer.freedesktop.org/documentation/tutorials/android/index.html?gi-language=c