Tags: android, opengl-es, android-mediacodec, video-encoding, egl

eglSwapBuffers fails with EGL_BAD_SURFACE when using a Surface from MediaCodec


I'm trying to encode a movie using MediaCodec and Surfaces (pixel buffer mode works, but performance is not good enough). However, every time I call eglSwapBuffers(), it fails with EGL_BAD_SURFACE, and as a result dequeueOutputBuffer() always returns -1 (INFO_TRY_AGAIN_LATER).
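
For context, here is roughly how I drain the encoder's output (a simplified sketch; encoder is the MediaCodec instance). With the failure above, this always takes the INFO_TRY_AGAIN_LATER branch because no frame ever reaches the codec:

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    int index = encoder.dequeueOutputBuffer(info, 10_000 /* timeout, microseconds */);
    if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
        // No output available yet (INFO_TRY_AGAIN_LATER == -1)
    } else if (index >= 0) {
        ByteBuffer encoded = encoder.getOutputBuffer(index);
        // ... write the encoded data to a MediaMuxer ...
        encoder.releaseOutputBuffer(index, false);
    }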

I have seen the examples on Bigflake and Grafika, and I have another working project where everything is fine, but I need to get this working in another setup which is slightly different.

I currently have a GLSurfaceView which does the screen rendering and is supplied with a custom EGLContextFactory/EGLConfigChooser. This allows me to create shared contexts to be used for separate OpenGL rendering in a native library. These are created using EGL10, but as far as I know this should not be an issue, since the underlying contexts only care about the client version.
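
The wiring is roughly the following (a sketch; the chooser, factory and renderer class names here are placeholders, not my actual classes):

    GLSurfaceView view = new GLSurfaceView(context);
    // Both must be installed before setRenderer(); with custom implementations,
    // the factory itself is responsible for requesting the right client version
    view.setEGLConfigChooser(new MyConfigChooser());   // hypothetical name
    view.setEGLContextFactory(new MyContextFactory()); // hypothetical name
    view.setRenderer(new MyRenderer());                // hypothetical name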

I've made sure the context is recordable, using the following config:

private android.opengl.EGLConfig chooseConfig14(android.opengl.EGLDisplay display) {
    // Configure EGL for recording and OpenGL ES 3.x
    int[] attribList = {
            EGL14.EGL_RED_SIZE, 8,
            EGL14.EGL_GREEN_SIZE, 8,
            EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_ALPHA_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGLExt.EGL_OPENGL_ES3_BIT_KHR,
            EGLExt.EGL_RECORDABLE_ANDROID, 1, // required for MediaCodec input surfaces
            EGL14.EGL_NONE
    };

    android.opengl.EGLConfig[] configs = new android.opengl.EGLConfig[1];
    int[] numConfigs = new int[1];
    if (!EGL14.eglChooseConfig(display, attribList, 0, configs, 0,
            configs.length, numConfigs, 0)) {
        return null;
    }

    return configs[0];
}

Now, I tried to simplify the scenario: when recording starts, I initialise a MediaCodec instance as an encoder and call createInputSurface() on the GLSurfaceView's thread. Once I have a surface, I turn it into an EGLSurface (EGL14), as follows:

EGLSurface createEGLSurface(Surface surface) {
    if (surface == null) return null;

    int[] surfaceAttribs = { EGL14.EGL_NONE };

    android.opengl.EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);

    android.opengl.EGLConfig config = chooseConfig14(display);

    // Wrap the codec's input Surface in an EGL window surface
    EGLSurface eglSurface = EGL14.eglCreateWindowSurface(display, config, surface, surfaceAttribs, 0);

    return eglSurface;
}
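
For completeness, the encoder-side setup mentioned above goes roughly like this (a sketch; the format values are illustrative, not my real configuration):

    MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface); // surface input mode
    format.setInteger(MediaFormat.KEY_BIT_RATE, 8_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // createInputSurface() must be called after configure() and before start()
    Surface inputSurface = encoder.createInputSurface();
    encoder.start();

    // Then, on the GLSurfaceView's thread:
    EGLSurface encoderSurface = createEGLSurface(inputSurface);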

When a new frame arrives from the camera, I send it to the screen and to another class that handles recording. That class just renders it to the EGLSurface built from MediaCodec's input surface, as follows:

public void drawToSurface(EGLSurface targetSurface, int width, int height, long timestampNano, boolean ignoreOrientation) {
    if (mTextures[0] == null) {
        Log.w(TAG, "Attempting to draw without a source texture");
        return;
    }

    EGLContext currentContext = EGL14.eglGetCurrentContext();
    EGLDisplay currentDisplay = EGL14.eglGetCurrentDisplay();

    // Note: the boolean result of eglMakeCurrent() is not checked here
    EGL14.eglMakeCurrent(currentDisplay, targetSurface, targetSurface, currentContext);
    int error = EGL14.eglGetError();

    ShaderProgram program = getProgramForTextureType(mTextures[0].getTextureType());

    program.draw(width, height, TextureRendererView.LayoutType.LINEAR_HORIZONTAL, 0, 1, mTextures[0]);
    error = EGL14.eglGetError();

    EGLExt.eglPresentationTimeANDROID(currentDisplay, targetSurface, timestampNano);
    error = EGL14.eglGetError();

    // This is the call that fails with EGL_BAD_SURFACE
    EGL14.eglSwapBuffers(currentDisplay, targetSurface);
    error = EGL14.eglGetError();

    Log.d(TAG, "drawToSurface");
}

For some reason, eglSwapBuffers() fails and reports EGL_BAD_SURFACE, and I haven't found a way to debug this further.

Update: I've tried querying the current surface after the call that makes it current, and it always returns a malformed surface (looking inside, I can see the handle is 0 and queries on it always fail). It looks like eglMakeCurrent() silently fails to bind the surface to the context.
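
The check looks roughly like this (a sketch, using the same variables as in drawToSurface() above):

    EGL14.eglMakeCurrent(currentDisplay, targetSurface, targetSurface, currentContext);

    // Ask EGL what it now considers the current draw surface...
    EGLSurface current = EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW);

    // ...and try to query it; on the failing devices the query returns false
    // and the surface's native handle is 0
    int[] value = new int[1];
    boolean queried = EGL14.eglQuerySurface(currentDisplay, current, EGL14.EGL_WIDTH, value, 0);
    Log.d(TAG, "current=" + current + " queried=" + queried
            + " error=0x" + Integer.toHexString(EGL14.eglGetError()));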

Moreover, I've determined that this issue appears on Qualcomm (Adreno) chips but not on Kirin, so it's definitely related to the native OpenGL implementation (which is somewhat funny, because I've always found Adreno to be the more permissive one when it comes to "bad" OpenGL configurations).


Solution

  • Fixed! It turns out EGL10 and EGL14 appear to play nicely together, but in some cases they fail in very subtle ways, such as the one I encountered.

    In my case, the EGLContextFactory I wrote was creating a base OpenGL ES context using EGL10 and then created more shared contexts on demand, again using EGL10. While I could retrieve them using EGL14 (either in Java or C) and the context handles were always correct (sharing textures between contexts worked like a charm), it failed mysteriously when trying to use an EGLSurface created from a context of EGL10 origin... on Adreno chips.

    The solution was to switch the EGLContextFactory to start from a context created with EGL14 and to keep creating shared contexts with EGL14. For the GLSurfaceView, which still required EGL10, I had to use a hack:

        @Override
        public javax.microedition.khronos.egl.EGLContext createContext(EGL10 egl10, javax.microedition.khronos.egl.EGLDisplay eglDisplay, javax.microedition.khronos.egl.EGLConfig eglConfig) {
            // Create a shared EGL14 context and make it current
            EGLContext context = createContext();
            boolean success = EGL14.eglMakeCurrent(mBaseEGLDisplay, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, context);

            if (!success) {
                int error = EGL14.eglGetError();
                Log.w(TAG, "Failed to make the new context current. Error: " + error);
            }

            // Get an EGL10 representation of our (current) EGL14 context...
            javax.microedition.khronos.egl.EGLContext egl14Context = egl10.eglGetCurrentContext();
            // ...and create a true EGL10 context shared with it
            javax.microedition.khronos.egl.EGLContext trueEGL10Context = egl10.eglCreateContext(eglDisplay, eglConfig, egl14Context, glAttributeList);

            destroyContext(context);
            return trueEGL10Context;
        }
    

    What this does is create a new shared context with EGL14, make it current, and then retrieve an EGL10 version of it. That version is not usable directly (for a reason I cannot exactly understand), but another context shared from it works well. The starting EGL14 context can then be destroyed (in my case it's put into a stack and reused later on).
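
    For reference, the EGL14-side createContext() helper used above does roughly the following (a sketch, not the verbatim implementation; mBaseEGLDisplay, mBaseEGLConfig and mBaseEGLContext are assumed to be fields of the factory):

        private EGLContext createContext() {
            int[] attribList = {
                    EGL14.EGL_CONTEXT_CLIENT_VERSION, 3,
                    EGL14.EGL_NONE
            };
            // Share with the base EGL14 context so resources stay shareable
            EGLContext context = EGL14.eglCreateContext(mBaseEGLDisplay, mBaseEGLConfig,
                    mBaseEGLContext, attribList, 0);
            if (context == EGL14.EGL_NO_CONTEXT) {
                Log.w(TAG, "eglCreateContext failed: 0x" + Integer.toHexString(EGL14.eglGetError()));
            }
            return context;
        }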

    I would really love to understand why this hack is needed, but for now I'm glad to have a working solution.