
Android camera2 API - Display processed frame in real time


I'm trying to create an app that processes camera images in real time and displays them on screen. I'm using the camera2 API. I have created a native library to process the images using OpenCV.

So far I have managed to set up an ImageReader that receives images in YUV_420_888 format, like this:

        mImageReader = ImageReader.newInstance(
                mPreviewSize.getWidth(),
                mPreviewSize.getHeight(),
                ImageFormat.YUV_420_888,
                4);
        mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mImageReaderHandler);

From there I'm able to get the image planes (Y, U and V), get their ByteBuffer objects and pass them to my native function. This happens in the mOnImageAvailableListener:

        Image image = reader.acquireLatestImage();
        if (image == null) {
            // acquireLatestImage() can return null if no new frame is ready yet
            return;
        }

        Image.Plane[] planes = image.getPlanes();
        Image.Plane YPlane = planes[0];
        Image.Plane UPlane = planes[1];
        Image.Plane VPlane = planes[2];

        ByteBuffer YPlaneBuffer = YPlane.getBuffer();
        ByteBuffer UPlaneBuffer = UPlane.getBuffer();
        ByteBuffer VPlaneBuffer = VPlane.getBuffer();

        int w = image.getWidth();
        int h = image.getHeight();
        myNativeMethod(YPlaneBuffer, UPlaneBuffer, VPlaneBuffer, w, h);

        image.close();
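One detail worth noting before handing the buffers to native code: the planes of a YUV_420_888 image are not necessarily tightly packed — each plane has its own row stride, and the U/V planes often have a pixel stride of 2. As a minimal sketch (the class and method names here are hypothetical helpers, not part of the camera2 API), a plane can be repacked into a tight byte array like this:

```java
import java.nio.ByteBuffer;

/**
 * Hypothetical helper that copies one plane of a YUV_420_888 image into a
 * tightly packed byte array, honouring the plane's rowStride and pixelStride.
 */
public class YuvUtil {
    public static byte[] packPlane(ByteBuffer buf, int width, int height,
                                   int rowStride, int pixelStride) {
        byte[] out = new byte[width * height];
        byte[] row = new byte[rowStride];
        buf.rewind();
        for (int y = 0; y < height; y++) {
            // The last row may be shorter than rowStride, so clamp the read.
            int len = Math.min(rowStride, buf.remaining());
            buf.get(row, 0, len);
            for (int x = 0; x < width; x++) {
                // Skip padding bytes between pixels (pixelStride > 1 on many devices).
                out[y * width + x] = row[x * pixelStride];
            }
        }
        return out;
    }
}
```

The values to pass come straight from the plane: `plane.getRowStride()` and `plane.getPixelStride()`. Alternatively, the strides can be passed through JNI so the native side builds the `cv::Mat` with the correct step instead of copying.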

On the native side I'm able to get the data pointers from the buffers, create a cv::Mat from the data and perform the image processing.

The next step would be to display the processed output on screen, but I'm unsure how to do that. Any help would be greatly appreciated.


Solution

  • I've tried the ANativeWindow approach, but it's a pain to set up and I haven't managed to get it working correctly. In the end I gave up and imported the OpenCV4Android library, which simplifies things by converting the camera data to an RGBA Mat behind the scenes.
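The OpenCV4Android route can be sketched roughly like this — the class names come from the OpenCV Android SDK, while the layout id and the inversion step are illustrative placeholders for whatever processing you do:

```java
import android.app.Activity;
import android.os.Bundle;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Core;
import org.opencv.core.Mat;

public class CameraActivity extends Activity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    private CameraBridgeViewBase mCameraView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera);           // layout id is illustrative
        mCameraView = (CameraBridgeViewBase) findViewById(R.id.camera_view);
        mCameraView.setCvCameraViewListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (OpenCVLoader.initDebug()) {                     // load the OpenCV native libs
            mCameraView.enableView();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (mCameraView != null) mCameraView.disableView();
    }

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame frame) {
        Mat rgba = frame.rgba();        // camera data already converted to an RGBA Mat
        Core.bitwise_not(rgba, rgba);   // placeholder processing; call your native code here
        return rgba;                    // the returned Mat is what gets drawn on screen
    }

    @Override public void onCameraViewStarted(int width, int height) {}
    @Override public void onCameraViewStopped() {}
}
```

Whatever Mat `onCameraFrame` returns is rendered by the view, so the display step comes for free — the trade-off is giving up direct control over the camera2 pipeline.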