I'm kind of wondering about this. If you create a texture in memory in DirectX with the CreateTexture function:
HRESULT CreateTexture(
    UINT Width,
    UINT Height,
    UINT Levels,
    DWORD Usage,
    D3DFORMAT Format,
    D3DPOOL Pool,
    IDirect3DTexture9** ppTexture,
    HANDLE* pSharedHandle
);
...and pass in D3DFMT_UNKNOWN as the format, what is supposed to happen exactly? If I try to get the surface of the first or second level, will it cause an error? Can it fail? Will the graphics device just pick a format of its own choosing? Could this cause problems between different graphics card models/brands?
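For concreteness, this is the kind of call I mean (a minimal sketch; pDevice stands for an already-initialized IDirect3DDevice9, and the sizes and pool are arbitrary):

    IDirect3DTexture9* pTexture = NULL;
    HRESULT hr = pDevice->CreateTexture(
        256, 256,        // Width, Height
        1,               // Levels
        0,               // Usage
        D3DFMT_UNKNOWN,  // what is this supposed to mean here?
        D3DPOOL_DEFAULT,
        &pTexture,
        NULL);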
I just tried it out, and for the most part it does not fail.

When Usage is set to D3DUSAGE_RENDERTARGET or D3DUSAGE_DYNAMIC, the texture consistently came out as D3DFMT_A8R8G8B8, no matter what I did to the back buffer format or other settings. I don't know whether that depends on my graphics card. My guess is that specifying unknown means "pick for me", and that the 32-bit format is the easiest one for my card.
When the usage was D3DUSAGE_DEPTHSTENCIL, it failed consistently.
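Roughly the kind of test I mean (a sketch, assuming pDevice is a valid IDirect3DDevice9*): create the texture with D3DFMT_UNKNOWN, then ask the first level for its D3DSURFACE_DESC, whose Format field shows what the runtime actually picked.

    #include <d3d9.h>
    #include <stdio.h>

    // Create a texture with D3DFMT_UNKNOWN and report what format the
    // runtime/driver actually chose. pDevice is assumed to be a valid,
    // already-created device.
    void TestUnknownFormat(IDirect3DDevice9* pDevice)
    {
        IDirect3DTexture9* pTexture = NULL;
        HRESULT hr = pDevice->CreateTexture(
            256, 256, 1,
            D3DUSAGE_RENDERTARGET,   // also try D3DUSAGE_DYNAMIC / D3DUSAGE_DEPTHSTENCIL
            D3DFMT_UNKNOWN,
            D3DPOOL_DEFAULT,
            &pTexture,
            NULL);

        if (FAILED(hr))
        {
            // In my tests this branch was only hit for D3DUSAGE_DEPTHSTENCIL.
            printf("CreateTexture failed: 0x%08lX\n", hr);
            return;
        }

        D3DSURFACE_DESC desc;
        pTexture->GetLevelDesc(0, &desc);
        // desc.Format holds the format that was chosen; on my card it was
        // always D3DFMT_A8R8G8B8 (enum value 21).
        printf("Chosen format: %d\n", (int)desc.Format);

        pTexture->Release();
    }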
So my best conclusion is that specifying D3DFMT_UNKNOWN as the format gives DirectX the choice of what it should be. Or perhaps it always just defaults to D3DFMT_A8R8G8B8.
Sadly, I can't confirm any of this in any documentation anywhere. :|