Tags: c++, video, 3d, directx-11, hlsl

Does DirectX11 Have Native Support for Rendering to a Video File?


I'm working on a project that needs to write several minutes of DX11 swap chain output to a video file (in any format). I've found plenty of resources for writing a completed frame to a texture file with DX11, but the only thing I've found relating to video output is using FFmpeg to stream the rendered frame, which uses an encoding pattern that doesn't fit my render pipeline and discards each frame immediately after streaming it.

I'm unsure what code I could post that would help answer this, but it might help to know that in this scenario I have a composite Shader Resource View + Render Target View that contains all of the data (in RGBA format) needed for the frame presented to the screen. Currently it is presented to a window on screen, but I also need a way to encode that frame (and the thousands of subsequent frames) into a video file. I'm using Vertex, Pixel, and Compute shaders in my rendering pipeline.


Solution

  • Found the answer thanks to a friend offline and Simon Mourier's reply! Check out this guide for a nice tutorial on using the Media Foundation API and its Sink Writer to encode a data buffer to a video file:

    https://learn.microsoft.com/en-us/windows/win32/medfound/tutorial--using-the-sink-writer-to-encode-video

    Other docs in the same section describe useful details like the different encoding types and the input formats they expect; a minimal setup is sketched below.

    In my case, the best way to get my composite RTV into a video file was to create a CPU-accessible staging buffer, copy the composite resource into it, and then read the CPU buffer as an array of pixel colors that the Sink Writer understands (see the sketches below).
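
To give a feel for what the linked tutorial covers, here is a minimal sketch of creating a Sink Writer for an MP4/H.264 file that accepts uncompressed 32-bit RGB frames. The function name `InitializeSinkWriter`, the output path `output.mp4`, and the bitrate are placeholders, not anything from the question; error handling is trimmed for brevity.

```cpp
// Minimal Sink Writer setup sketch, loosely following the Media Foundation
// sink writer tutorial. Width/height/fps/bitrate and the path are placeholders.
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>
#include <wrl/client.h>

#pragma comment(lib, "mfplat.lib")
#pragma comment(lib, "mfreadwrite.lib")
#pragma comment(lib, "mfuuid.lib")

using Microsoft::WRL::ComPtr;

HRESULT InitializeSinkWriter(UINT32 width, UINT32 height, UINT32 fps,
                             ComPtr<IMFSinkWriter>& writer, DWORD& streamIndex)
{
    // Create the sink writer; the container format is inferred from the extension.
    HRESULT hr = MFCreateSinkWriterFromURL(L"output.mp4", nullptr, nullptr, &writer);
    if (FAILED(hr)) return hr;

    // Output stream: H.264 at a placeholder bitrate.
    ComPtr<IMFMediaType> outType;
    MFCreateMediaType(&outType);
    outType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    outType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);
    outType->SetUINT32(MF_MT_AVG_BITRATE, 8000000);
    outType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    MFSetAttributeSize(outType.Get(), MF_MT_FRAME_SIZE, width, height);
    MFSetAttributeRatio(outType.Get(), MF_MT_FRAME_RATE, fps, 1);
    MFSetAttributeRatio(outType.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
    hr = writer->AddStream(outType.Get(), &streamIndex);
    if (FAILED(hr)) return hr;

    // Input stream: uncompressed 32-bit RGB frames supplied from a CPU buffer.
    ComPtr<IMFMediaType> inType;
    MFCreateMediaType(&inType);
    inType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    inType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
    inType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
    MFSetAttributeSize(inType.Get(), MF_MT_FRAME_SIZE, width, height);
    MFSetAttributeRatio(inType.Get(), MF_MT_FRAME_RATE, fps, 1);
    MFSetAttributeRatio(inType.Get(), MF_MT_PIXEL_ASPECT_RATIO, 1, 1);
    hr = writer->SetInputMediaType(streamIndex, inType.Get(), nullptr);
    if (FAILED(hr)) return hr;

    // Start accepting samples. MFStartup(MF_VERSION) must be called once before
    // any of this, and writer->Finalize() after the last frame is written.
    return writer->BeginWriting();
}
```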
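
And a sketch of the per-frame path the answer describes: copy the composite texture into a CPU-readable staging texture, map it, and hand the pixels to the Sink Writer as a sample. `WriteFrame` and its parameters are hypothetical names; `writer` and `streamIndex` are assumed to come from the setup sketch above, and `compositeTex` is assumed to be the texture behind the composite SRV/RTV.

```cpp
// Per-frame sketch: GPU copy to a staging texture, CPU map, then WriteSample.
#include <d3d11.h>
#include <mfapi.h>
#include <mfreadwrite.h>
#include <wrl/client.h>
#include <cstring>

using Microsoft::WRL::ComPtr;

HRESULT WriteFrame(ID3D11Device* device, ID3D11DeviceContext* context,
                   ID3D11Texture2D* compositeTex, IMFSinkWriter* writer,
                   DWORD streamIndex, UINT32 width, UINT32 height,
                   LONGLONG frameTime100ns, LONGLONG frameDuration100ns)
{
    // Staging texture with the same size/format as the composite, but CPU-readable.
    D3D11_TEXTURE2D_DESC desc = {};
    compositeTex->GetDesc(&desc);
    desc.Usage = D3D11_USAGE_STAGING;
    desc.BindFlags = 0;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
    desc.MiscFlags = 0;

    ComPtr<ID3D11Texture2D> staging;
    HRESULT hr = device->CreateTexture2D(&desc, nullptr, &staging);
    if (FAILED(hr)) return hr;

    // GPU-side copy, then map to read the pixels on the CPU.
    context->CopyResource(staging.Get(), compositeTex);
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    hr = context->Map(staging.Get(), 0, D3D11_MAP_READ, 0, &mapped);
    if (FAILED(hr)) return hr;

    // Pack the rows into a Media Foundation buffer. RowPitch may be wider than
    // width * 4, and depending on the swap chain format (R8G8B8A8 vs B8G8R8A8)
    // an R<->B swizzle and/or vertical flip may also be needed here.
    const DWORD rowBytes = width * 4;
    const DWORD frameBytes = rowBytes * height;
    ComPtr<IMFMediaBuffer> buffer;
    hr = MFCreateMemoryBuffer(frameBytes, &buffer);
    if (FAILED(hr)) { context->Unmap(staging.Get(), 0); return hr; }

    BYTE* dst = nullptr;
    buffer->Lock(&dst, nullptr, nullptr);
    const BYTE* src = static_cast<const BYTE*>(mapped.pData);
    for (UINT32 y = 0; y < height; ++y)
        memcpy(dst + y * rowBytes, src + y * mapped.RowPitch, rowBytes);
    buffer->Unlock();
    buffer->SetCurrentLength(frameBytes);
    context->Unmap(staging.Get(), 0);

    // Wrap the buffer in a sample with a timestamp and duration (100 ns units).
    ComPtr<IMFSample> sample;
    hr = MFCreateSample(&sample);
    if (FAILED(hr)) return hr;
    sample->AddBuffer(buffer.Get());
    sample->SetSampleTime(frameTime100ns);
    sample->SetSampleDuration(frameDuration100ns);

    return writer->WriteSample(streamIndex, sample.Get());
}
```

In a real renderer you would create the staging texture once and reuse it, and since mapping immediately after `CopyResource` stalls until the GPU catches up, a small ring of staging textures read back a frame or two late keeps the readback from blocking the render loop.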