android, c++, ffmpeg, dji-sdk, libstreaming

How to stream live video from the DJI Phantom 3 Professional camera?


I have to get the live video stream from the DJI Phantom 3 camera into my C++ application, in order to do computer vision processing with OpenCV.

First I tried sending the raw H.264 data through a UDP socket, inside this callback:

    mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {

        @Override
        public void onResult(byte[] videoBuffer, int size) {
            // Forward the raw H.264 buffer over UDP
            // (enviarFrame is a method from a class I created)
            if (gravar_trigger) controleVideo.enviarFrame(videoBuffer, size);

            // Keep feeding the SDK's decoder so the on-screen preview still works
            if (mCodecManager != null) mCodecManager.sendDataToDecoder(videoBuffer, size);
        }
    };
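For reference, the desktop side only has to read those datagrams from a plain UDP socket. A minimal sketch (POSIX sockets; port 5000 and the 64 KiB buffer are assumptions that must match the sender):

    #include <cstdint>
    #include <cstdio>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = INADDR_ANY;
        addr.sin_port = htons(5000);  // must match the Android sender's port
        if (bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
            perror("bind");
            return 1;
        }
        uint8_t buf[65536];  // a single datagram can never exceed 64 KiB
        for (;;) {
            ssize_t n = recv(sock, buf, sizeof(buf), 0);
            if (n <= 0) break;
            // buf[0..n) holds one chunk of the raw H.264 byte stream;
            // hand it to the decoder (see the FFmpeg sketch below).
        }
        close(sock);
        return 0;
    }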

That communication above works well. However, I haven't been able to decode the UDP H.264 data in my C++ desktop application. I tested with the FFmpeg library, but couldn't manage to allocate an AVPacket with my UDP data in order to decode it using avcodec_send_packet and avcodec_receive_frame. I also had problems with the AVCodecContext, since my UDP communication is not a stream like RTSP, where the decoder can read information about its source. Therefore, I had to change how I was trying to solve the problem.
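A minimal sketch of that decoding path, assuming the raw Annex-B H.264 bytes arrive in arbitrary chunks (e.g. the UDP datagrams above). The piece my attempts were missing is the parser: av_parser_parse2 reassembles the byte stream into complete packets that avcodec_send_packet accepts, so no container or RTSP metadata is needed. The wrapper struct and its names are illustrative only:

    #include <cstdint>
    extern "C" {
    #include <libavcodec/avcodec.h>
    }

    // Hypothetical decoder wrapper; feed() takes chunks of a raw H.264 stream.
    struct H264Decoder {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecParserContext *parser = av_parser_init(AV_CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        AVPacket *pkt = av_packet_alloc();
        AVFrame *frame = av_frame_alloc();

        H264Decoder() { avcodec_open2(ctx, codec, nullptr); }

        void feed(const uint8_t *data, int size) {
            while (size > 0) {
                // The parser buffers partial NAL units and emits whole packets.
                int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                            data, size,
                                            AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
                if (used < 0) return;          // parse error: drop this chunk
                data += used;
                size -= used;
                if (pkt->size == 0) continue;  // no complete packet yet
                if (avcodec_send_packet(ctx, pkt) < 0) continue;
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    // frame->data[0..2] now hold the YUV planes; convert with
                    // sws_scale or wrap them in a cv::Mat for OpenCV processing.
                }
            }
        }
    };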

Then I found libstreaming, which can be used to stream the Android camera to a Wowza server, creating something like an RTSP stream connection whose data could be consumed easily in my final C++ application using OpenCV's VideoCapture. However, libstreaming uses its own SurfaceView. In other words, I would have to link the libstreaming SurfaceView with the DJI drone's videoSurface, and I'm so new to Android that I have no clue how to do that.
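Reading such an RTSP stream in the final C++ application would then be straightforward. A sketch, assuming a hypothetical Wowza endpoint URL:

    #include <opencv2/opencv.hpp>

    int main() {
        // Placeholder URL: point this at the actual Wowza RTSP endpoint.
        cv::VideoCapture cap("rtsp://wowza-server:1935/live/drone", cv::CAP_FFMPEG);
        if (!cap.isOpened()) return 1;

        cv::Mat frame;
        while (cap.read(frame)) {
            // Run the computer vision pipeline on `frame` here.
            cv::imshow("DJI stream", frame);
            if (cv::waitKey(1) == 27) break;  // Esc quits
        }
        return 0;
    }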

To sum up: is this the correct approach? Does someone have a better idea? Thanks in advance.


Solution

  • After a long time, I finally developed a system that streams the DJI drone camera correctly:

    https://github.com/raullalves/DJI-Drone-Camera-Streaming