Tags: opengl, glew, libav, egl, vaapi

VAAPI surface to OpenGL texture


I have video frames decoded with VAAPI/FFmpeg into a VASurface. Now I want to render them using an OpenGL texture. I was able to copy the frames into system memory (with vaDeriveImage and vaMapBuffer) and update a texture with the received data, but that was really slow and is not my goal here. Then I found EGL used in a few other repos.
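For context, this is roughly what my slow path looked like (a reconstruction from memory, not my exact code; it assumes an NV12 surface, only uploads the luma plane, and omits error handling):

#include <stdint.h>
#include <va/va.h>
#include <GL/gl.h>

static void upload_surface_slow( VADisplay dpy, VASurfaceID surface, GLuint texture )
{
    VAImage img;
    void* data = NULL;

    vaDeriveImage( dpy, surface, &img );        // wrap the surface in a VAImage
    vaMapBuffer( dpy, img.buf, &data );         // map it into system memory

    glBindTexture( GL_TEXTURE_2D, texture );
    // row length is given in pixels; for 8-bit GL_RED it equals the pitch in bytes
    glPixelStorei( GL_UNPACK_ROW_LENGTH, img.pitches[0] );
    glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, img.width, img.height,
                     GL_RED, GL_UNSIGNED_BYTE, (uint8_t*) data + img.offsets[0] );
    glPixelStorei( GL_UNPACK_ROW_LENGTH, 0 );

    vaUnmapBuffer( dpy, img.buf );
    vaDestroyImage( dpy, img.image_id );
}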

So I found this repo, which as far as I can see renders the frames entirely with EGL. That is not what I want; I need the frames in a texture for later use.

Then I came across fmor's demo program. This really looks like magic to me: there are just two steps at init, and after that he can use the texture with no problems.

// in player.c, before the decoding happens:
    egl_image = egl_create_image_from_va( &surface, player->video_va_display, player->video_cc->width, player->video_cc->height );
    if( egl_image == EGL_NO_IMAGE )
        goto LBL_FAILED;

    glGenTextures( 1, &player->video_texture );
    glBindTexture( GL_TEXTURE_2D, player->video_texture );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
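    // define the texture's image directly from the EGLImage; no glTexImage2D / pixel upload happens here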
    glEGLImageTargetTexture2DOES( GL_TEXTURE_2D, egl_image  );

// in util.c:
EGLImage egl_create_image_from_va(VASurfaceID* _va_surface, VADisplay va_display, int width, int height)
{
    EGLImage egl_image;
    VASurfaceID va_surface;
    VASurfaceAttrib va_surface_attrib;
    VADRMPRIMESurfaceDescriptor va_surface_descriptor;
    int r;

    egl_image = EGL_NO_IMAGE;
    va_surface = VA_INVALID_SURFACE;

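    // describe the surface to create: request an RGBA pixel format (single plane, easy to import into GL)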
    va_surface_attrib.type = VASurfaceAttribPixelFormat;
    va_surface_attrib.flags = VA_SURFACE_ATTRIB_SETTABLE;
    va_surface_attrib.value.type = VAGenericValueTypeInteger;
    va_surface_attrib.value.value.i = VA_FOURCC_RGBA;

    r = vaCreateSurfaces( va_display, VA_RT_FORMAT_RGB32, width, height, &va_surface, 1, &va_surface_attrib, 1 );
    if( r != VA_STATUS_SUCCESS )
        goto LBL_FAILED;

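    // export the surface as a DRM PRIME handle: dma-buf file descriptor(s) plus the plane layout (format, offset, pitch)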
    r = vaExportSurfaceHandle( va_display, va_surface, VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2, VA_EXPORT_SURFACE_READ_ONLY, &va_surface_descriptor );
    if( r != 0 )
        goto LBL_FAILED;

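    // import plane 0 of the exported dma-buf as an EGLImage (EGL_EXT_image_dma_buf_import)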
    EGLAttrib egl_img_attributes[] = {
        EGL_LINUX_DRM_FOURCC_EXT, va_surface_descriptor.layers[0].drm_format,
        EGL_WIDTH, va_surface_descriptor.width,
        EGL_HEIGHT, va_surface_descriptor.height,
        EGL_DMA_BUF_PLANE0_FD_EXT, va_surface_descriptor.objects[va_surface_descriptor.layers[0].object_index[0]].fd,
        EGL_DMA_BUF_PLANE0_OFFSET_EXT, va_surface_descriptor.layers[0].offset[0],
        EGL_DMA_BUF_PLANE0_PITCH_EXT, va_surface_descriptor.layers[0].pitch[0],
        EGL_NONE
    };
    egl_image = eglCreateImage( eglGetCurrentDisplay(), EGL_NO_CONTEXT, EGL_LINUX_DMA_BUF_EXT, NULL, egl_img_attributes );
    if( egl_image == EGL_NO_IMAGE )
        goto LBL_FAILED;


    *_va_surface = va_surface;
    return egl_image;
LBL_FAILED:
    if( va_surface != VA_INVALID_SURFACE )
        vaDestroySurfaces( va_display, &va_surface, 1 );
    return EGL_NO_IMAGE;
}
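One thing I noticed while reading the EGL_EXT_image_dma_buf_import spec: EGL apparently does not take ownership of the exported file descriptors, so after eglCreateImage the application is expected to close them itself. If I read that correctly, a fragment like this would go right after the eglCreateImage call above (it reuses va_surface_descriptor from that function and needs <unistd.h>):

    // EGL keeps its own reference to the dma-buf, so the exported fds can be closed here
    for( uint32_t i = 0; i < va_surface_descriptor.num_objects; i++ )
        close( va_surface_descriptor.objects[i].fd );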

Can someone tell me what is happening here? And how can I reproduce this without using GLEW?
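For the GLEW side, I think the only extension entry point the init code above actually needs is glEGLImageTargetTexture2DOES, which could presumably be loaded by hand with eglGetProcAddress. A sketch of what I have in mind (the header names assume GLES-style headers; on desktop GL the typedef may come from a different header):

#include <EGL/egl.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>   // PFNGLEGLIMAGETARGETTEXTURE2DOESPROC

// Attach an EGLImage to the currently bound GL_TEXTURE_2D without GLEW.
static void texture_from_egl_image( EGLImage egl_image )
{
    static PFNGLEGLIMAGETARGETTEXTURE2DOESPROC p_glEGLImageTargetTexture2DOES = NULL;

    if( !p_glEGLImageTargetTexture2DOES )
        p_glEGLImageTargetTexture2DOES = (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress( "glEGLImageTargetTexture2DOES" );

    p_glEGLImageTargetTexture2DOES( GL_TEXTURE_2D, egl_image );
}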

My best guess is that with the EGL_LINUX_DMA_BUF_EXT parameter for the EGLImage creation there is some direct memory access going on. Is VAAPI rendering right into the OpenGL texture here?

Also, vaExportSurfaceHandle is used here and I don't really get what it does.
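From what I can see in va/va_drmcommon.h, it fills a VADRMPRIMESurfaceDescriptor with dma-buf file descriptors and the per-plane layout. Dumping it with a little helper of my own at least shows what comes back, even if I still don't see how it ends up in the texture:

#include <stdio.h>
#include <stdint.h>
#include <va/va.h>
#include <va/va_drmcommon.h>   // VADRMPRIMESurfaceDescriptor

static void dump_prime_descriptor( const VADRMPRIMESurfaceDescriptor* d )
{
    printf( "fourcc %08x, %ux%u, %u objects, %u layers\n",
            d->fourcc, d->width, d->height, d->num_objects, d->num_layers );
    for( uint32_t i = 0; i < d->num_layers; i++ )
        printf( "layer %u: drm_format %08x, fd %d, offset %u, pitch %u\n",
                i, d->layers[i].drm_format,
                d->objects[ d->layers[i].object_index[0] ].fd,
                d->layers[i].offset[0], d->layers[i].pitch[0] );
}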

Edit: I have read through a lot of EGL-related posts now and I think I understand a bit more. But when I looked at fmor's demo program again I got confused: there are several calls to eglGetCurrentDisplay(), but I can't find where this display is set up, so I can't reproduce that. Could it be that GLEW is doing something behind the scenes, or am I missing something else?

Even eglInitialize() is never called.

When I try to create an EGLDisplay on my own with eglGetDisplay(native_display), like it is done in ffvademo, what do I put in for the native_display? In ffvademo either an X11 Display or a DRM display (?) is passed in. Both are, as far as I know, used to render to the screen, or not? Also, I think passing in the VAAPI display would not be the right thing here. I could really use some help here, guys...
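For reference, this is roughly what I would try by hand, assuming an X11 session and that the native display is the windowing-system connection (the Xlib Display opened with XOpenDisplay), not the VADisplay:

#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <X11/Xlib.h>

// One possible way to get an EGLDisplay without GLFW (X11 case, error handling minimal).
static EGLDisplay create_egl_display_x11( Display* x11_display )
{
    EGLDisplay egl_display;
    EGLint major, minor;

    egl_display = eglGetPlatformDisplay( EGL_PLATFORM_X11_KHR, x11_display, NULL );
    if( egl_display == EGL_NO_DISPLAY )
        return EGL_NO_DISPLAY;

    if( !eglInitialize( egl_display, &major, &minor ) )
        return EGL_NO_DISPLAY;

    return egl_display;
}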


Solution

  • Ok, I found it. GLFW is doing all of the stuff needed. I'm going to read through that code to find my answers, I guess.
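As far as I can tell now, GLFW creates and initializes the EGL display and context itself when an EGL-backed context is requested, so by the time fmor's code calls eglGetCurrentDisplay() everything is already in place. A minimal sketch of that setup as I understand it (window size and title are just placeholders):

#include <GLFW/glfw3.h>
#include <EGL/egl.h>

int main( void )
{
    GLFWwindow* window;
    EGLDisplay egl_display;

    if( !glfwInit() )
        return 1;

    // ask GLFW for an EGL-backed context (on X11 the default would be GLX)
    glfwWindowHint( GLFW_CONTEXT_CREATION_API, GLFW_EGL_CONTEXT_API );

    window = glfwCreateWindow( 1280, 720, "vaapi + egl", NULL, NULL );
    if( !window )
        return 1;

    glfwMakeContextCurrent( window );

    // GLFW has created and initialized the EGL display for us by now
    egl_display = eglGetCurrentDisplay();
    (void) egl_display;

    // ... create the VA surface, EGLImage and texture as in the code above ...

    glfwDestroyWindow( window );
    glfwTerminate();
    return 0;
}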