Tags: c++, video, directx, directshow, directx-11

DirectShow Pixels Are Out of Order When Capturing NTSC Analog Board


I am writing a custom video rendering filter for DirectShow. My renderer assumes the incoming pixels are organized one row of pixels at a time, tightly packed (correct assumption?), and blits them to another DirectX display elsewhere using a DirectX texture.

This approach works with webcams as input, but when I use an analog capture board, the samples the renderer receives are not in any expected order (see left image below). When I render the capture using the stock DirectShow video renderer, it looks fine (see right image below). So the stock DirectShow renderer must be doing something extra that my renderer is not. Any idea what it is?

Some more details:

  • The capture card is NTSC; I'm not sure if that matters.
  • As input to the custom renderer, I am accepting only MEDIASUBTYPE_RGB24, so I do not think this is a YUV issue (is it?).
  • It's a bit hard to see, but the second image below is my filter graph. My custom renderer connects to the color space converter on the far right.
  • I assume that the pixels coming into my renderer are all organized one row of pixels at a time. Is this a correct assumption?

sample output

My Rendering Graph


Solution

  • Maybe the texture is padded to keep rows aligned to a multiple of 32 bytes per row? Mind you, I have never used DirectShow, but that is what I would expect in D3D.

    In other words, your input may have a different stride than you think. Unfortunately I do not know DirectShow, so I can only assume that whatever computes input/output coordinates needs a different stride factor, e.g. something in code that looks like `offset = y * stride + x`.