For a research project involving Optical Camera Communication (OCC), I need to analyze video frames, which I currently do offline on my PC using Python and OpenCV (I record the video with my smartphone, then process it afterwards). I now want to process the video live, using the camera of my Samsung Galaxy A40. I am fairly new to Android development, so I just want to create a basic application that captures video and transmits it to my PC for analysis. My application requires 1080p capture at 30 fps or better.
I am currently using the Android CameraX API, which provides the ImageAnalysis use case. This gives me access to the raw image planes: I receive an ImageProxy in YUV_420_888 format.
I am still familiarizing myself with the API, but I have been able to reproduce some basic applications from online examples. I have one concrete question left:
What is the best way to serialize and transmit the images to my Python application on the PC? I was planning to open a simple TCP connection using sockets, convert each image to a bitmap and send it directly. However, I am not certain this approach is very efficient, and I don't know how to convert and buffer the image. Code examples are certainly welcome. A wired connection (USB cable to my PC) would also be possible, but I didn't find any support for that.
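For the PC side, this is roughly the kind of receiver I had in mind (just a sketch of my current idea; the 4-byte length prefix is my own framing choice, not an established protocol):

```python
import socket
import struct

def recv_exact(conn, n):
    """Read exactly n bytes from a TCP connection (recv may return less)."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(conn):
    """Receive one frame, prefixed by a 4-byte big-endian length header."""
    (length,) = struct.unpack("!I", recv_exact(conn, 4))
    return recv_exact(conn, length)
```

The sender would then write `struct.pack("!I", len(frame)) + frame` for each image, so frame boundaries survive TCP's byte-stream semantics.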
Any ideas are welcome! Thanks in advance.
If you really care about performance, you should use UDP instead of TCP. Just keep in mind that UDP doesn't guarantee that the packets you send arrive in the same order on the receiver side, or that they arrive at all, and that the maximum size of a single UDP message is 65535 bytes. So it's up to you to implement some logic for reordering packets on the receiver side and for telling the sender that something was missed.
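A minimal sketch of that reordering logic (the 4-byte sequence-number header is my own choice for illustration; the datagrams themselves would be sent with `socket.sendto` on a `SOCK_DGRAM` socket):

```python
import struct

def make_packet(seq, payload):
    """Prefix the payload with a 4-byte big-endian sequence number."""
    return struct.pack("!I", seq) + payload

def parse_packet(datagram):
    """Split a datagram back into (sequence number, payload)."""
    (seq,) = struct.unpack("!I", datagram[:4])
    return seq, datagram[4:]

def reorder(datagrams):
    """Sort received datagrams by sequence number and report gaps.

    Assumes at least one datagram was received. The 'missing' list is
    what you would feed into a retransmission request to the sender.
    """
    parsed = sorted(parse_packet(d) for d in datagrams)
    seqs = [s for s, _ in parsed]
    missing = sorted(set(range(seqs[0], seqs[-1] + 1)) - set(seqs))
    return [p for _, p in parsed], missing
```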
In general, real-time streaming apps like Stadia, PS Remote Play, etc. use some kind of FEC (forward error correction) mechanism so that the receiver can restore missing packets by itself. For my streaming app I used jerasure, but using FEC is of course optional.
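jerasure is a C library implementing Reed-Solomon-style erasure codes; as a toy illustration of the FEC idea, a single XOR parity packet lets the receiver recover any one lost packet in a group without asking the sender (a much weaker scheme than what jerasure offers, but it shows the principle):

```python
def xor_parity(packets):
    """Byte-wise XOR of equal-length packets; sent as an extra FEC packet."""
    parity = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            parity[i] ^= b
    return bytes(parity)

def recover_missing(received, parity):
    """Recover the single missing packet of a group from its parity.

    XORing the surviving packets with the parity packet cancels them
    out, leaving exactly the missing packet.
    """
    return xor_parity(list(received) + [parity])
```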
The YUV_420_888 format is also far from optimal in terms of compression; for streaming I would suggest an encoded format like H.264 or VP8. You will find some examples on the internet, e.g. (the project is a bit old, but it might be helpful to get the idea):
https://github.com/bytestar/android-h264-stream-demo
A very good open-source streaming app is Moonlight, which is also available for Android. It is definitely worth looking at to see how things work.
Some of this might be overkill for your project, but at the very least I would recommend using UDP and splitting each large YUV_420_888 frame into several smaller packets. Include in each of these small packets the information the receiver needs to reassemble the frame (for example a frame ID, the packet's index within the frame, and the total packet count). The reason for splitting up a large frame is to avoid MTU problems: messages larger than the current MTU, usually around 1500 bytes, will be fragmented or discarded.
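A sketch of that split/reassemble step (the 1400-byte payload limit and the frame-id/index/count header layout are my own assumptions, chosen to stay safely below a typical 1500-byte MTU):

```python
import struct

MAX_PAYLOAD = 1400  # assumed safe payload size below a 1500-byte MTU
HEADER = struct.Struct("!IHH")  # frame id, packet index, packet count

def split_frame(frame_id, frame_bytes):
    """Split one frame into MTU-safe packets, each with a small header."""
    chunks = [frame_bytes[i:i + MAX_PAYLOAD]
              for i in range(0, len(frame_bytes), MAX_PAYLOAD)] or [b""]
    return [HEADER.pack(frame_id, idx, len(chunks)) + c
            for idx, c in enumerate(chunks)]

def reassemble(packets):
    """Rebuild a frame from its packets; returns None while some are missing."""
    parts = {}
    count = None
    for p in packets:
        frame_id, idx, count = HEADER.unpack(p[:HEADER.size])
        parts[idx] = p[HEADER.size:]
    if count is None or len(parts) < count:
        return None
    return b"".join(parts[i] for i in range(count))
```

Note that a full 1080p YUV_420_888 frame is about 3 MB, i.e. over 2000 such packets per frame, which is another argument for encoding (H.264/VP8) before sending.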