Tags: c#, wpf, rendering, argb

How to render video from ARGB Frames


I'm using the Microsoft.MixedReality.WebRTC library and I am planning to use it for my next project, a real-time video chat app. I have been able to establish a connection and pass video frames around.

How would I properly render those frames and display them as video?

Using WPF's MediaElement seems pretty easy, but I can only give it a Uri as the source; I cannot feed it single frames, AFAIK.

I have read that drawing bitmaps is a possible solution, but I am sure this would mean many hours of reinventing the wheel and testing, which I am not a fan of doing unless there is no other way.

The library works as follows: each time the client receives a new frame, the Argb32VideoFrameReady event is raised and an Argb32VideoFrame struct is passed to the callback. The struct contains an IntPtr to the raw data, along with the frame's Height, Width and Stride.

More information on the specific struct here
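
Roughly, my handler currently looks like this (just a sketch of what I have; remoteVideoTrack stands for whatever object exposes the event in my setup):

    remoteVideoTrack.Argb32VideoFrameReady += (Argb32VideoFrame frame) =>
    {
        // frame.data   - IntPtr to the raw 32-bit ARGB pixel data
        // frame.width  - frame width in pixels
        // frame.height - frame height in pixels
        // frame.stride - number of bytes per row
        // TODO: render this frame somehow
    };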

What would be some ways I could achieve this?

I am planning on using WPF. The solution should target Windows 7+ and .NET Framework 4.6.2.

Thanks in advance.


Solution

  • With an Image element in XAML

    <Image x:Name="image"/>
    

    the simple method below would directly copy the frame into a WriteableBitmap that is assigned to the Image's Source property (see the note at the end of this answer on calling it from the frame callback).

    private void UpdateImage(Argb32VideoFrame frame)
    {
        var bitmap = image.Source as WriteableBitmap;
        var width = (int)frame.width;
        var height = (int)frame.height;
    
        // Create the WriteableBitmap on the first frame, or re-create it
        // whenever the incoming frame size changes.
        if (bitmap == null ||
            bitmap.PixelWidth != width ||
            bitmap.PixelHeight != height)
        {
            bitmap = new WriteableBitmap(
                width, height, 96, 96, PixelFormats.Bgra32, null);
            image.Source = bitmap;
        }
    
        // Copy the raw frame buffer straight into the bitmap.
        bitmap.WritePixels(
            new Int32Rect(0, 0, width, height),
            frame.data, height * frame.stride, frame.stride);
    }
    

    Argb32VideoFrame from here: https://github.com/microsoft/MixedReality-WebRTC/blob/master/libs/Microsoft.MixedReality.WebRTC/VideoFrame.cs

    PixelFormats.Bgra32 seems to be the proper format, due to this comment on the struct:

    The ARGB components are in the order of a little endian 32-bit integer, so 0xAARRGGBB, or (B, G, R, A) as a sequence of bytes in memory with B first and A last.
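
    Note that the frame callback is normally raised on a background (non-UI) thread, so the method above has to be invoked on the dispatcher. A minimal sketch, assuming remoteVideoTrack is the object raising the event as in the question; a synchronous Invoke keeps frame.data valid while WritePixels copies it:

    remoteVideoTrack.Argb32VideoFrameReady += (Argb32VideoFrame frame) =>
    {
        // The native buffer behind frame.data is only guaranteed to stay
        // valid for the duration of the callback, so use a synchronous
        // Invoke and let the UI thread finish copying before returning.
        Application.Current.Dispatcher.Invoke(() => UpdateImage(frame));
    };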