Tags: android, surface, android-camera2

Camera2 API: can it output two cameras into same surface?


I have a specific requirement to combine the signal from two cameras onto one surface. Each camera would fill half of the surface. The surface would either be displayed or reside in an OpenGL texture.

Is this at all possible using the Camera2 API? The first question is whether a target rectangle for projection onto the surface can be specified; the second is whether two cameras can use a single surface as output.

The reason for this is that our hardware delivers one picture signal split across two Android cameras, and we need to stitch the picture back together in software, so that a surface containing the picture from both cameras can be saved as video using MediaRecorder.

Thanks


Solution

  • Not directly. A Surface can only be owned by a single producing source at a time, so if you include a particular Surface in a session configuration for camera device 0, trying to use it with camera device 1 at the same time will get you an error in session creation for camera device 1.
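
    For illustration, here's a minimal Kotlin sketch of what that constraint implies: each device is configured with its own SurfaceTexture-backed Surface. The texture IDs, camera handles, callbacks, and handler are placeholders for objects you'd already have:

        import android.graphics.SurfaceTexture
        import android.hardware.camera2.CameraDevice
        import android.view.Surface

        // One SurfaceTexture per camera - a Surface accepts a single producer.
        val tex0 = SurfaceTexture(glTexId0).apply { setDefaultBufferSize(1280, 720) }
        val tex1 = SurfaceTexture(glTexId1).apply { setDefaultBufferSize(1280, 720) }
        val surface0 = Surface(tex0)
        val surface1 = Surface(tex1)

        // Each session gets its own Surface; adding surface0 to camera1's
        // session as well would make that session's creation fail.
        camera0.createCaptureSession(listOf(surface0), sessionCallback0, handler)
        camera1.createCaptureSession(listOf(surface1), sessionCallback1, handler)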

    If you want to do this, you'll need to implement your own merger, which takes in buffers from each camera, and composites them into one output.

    The most efficient route for this is likely via the GPU and OpenGL. In short, you'll need to create an OpenGL context and two SurfaceTextures, one for each camera output. Then you can use eglCreateWindowSurface with the MediaRecorder Surface to create an output EGL window to draw into.
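
    As a hedged sketch of that EGL setup (error checking omitted; recorderSurface is assumed to come from MediaRecorder.getSurface() after prepare()):

        import android.opengl.EGL14

        val display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY)
        val version = IntArray(2)
        EGL14.eglInitialize(display, version, 0, version, 1)

        val attribs = intArrayOf(
            EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
            0x3142, 1,  // EGL_RECORDABLE_ANDROID, required for MediaRecorder surfaces
            EGL14.EGL_NONE
        )
        val configs = arrayOfNulls<android.opengl.EGLConfig>(1)
        val numConfigs = IntArray(1)
        EGL14.eglChooseConfig(display, attribs, 0, configs, 0, 1, numConfigs, 0)

        val context = EGL14.eglCreateContext(
            display, configs[0], EGL14.EGL_NO_CONTEXT,
            intArrayOf(EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE), 0
        )

        // The MediaRecorder Surface becomes the EGL window you draw into.
        val eglSurface = EGL14.eglCreateWindowSurface(
            display, configs[0], recorderSurface, intArrayOf(EGL14.EGL_NONE), 0
        )
        EGL14.eglMakeCurrent(display, eglSurface, eglSurface, context)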

    At that point, you can render the two input textures into your output surface in any way you want - it sounds like what you want is two side-by-side rectangles, each with one camera's output.
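
    A sketch of that per-frame draw, splitting the output with two viewports (drawQuad is a hypothetical helper that draws a full-viewport quad with a samplerExternalOES shader; width/height are the recorder surface dimensions):

        import android.opengl.GLES20

        // Must run on the thread where the EGL context above is current.
        tex0.updateTexImage()
        tex1.updateTexImage()

        GLES20.glViewport(0, 0, width / 2, height)           // left half: camera 0
        drawQuad(glTexId0)

        GLES20.glViewport(width / 2, 0, width / 2, height)   // right half: camera 1
        drawQuad(glTexId1)

        // Pushes the composited frame into the MediaRecorder.
        EGL14.eglSwapBuffers(display, eglSurface)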

    The details of setting up an Android EGL environment are too long to put here, but there should be plenty of samples of that available.