I am trying to create a Texture2D in D3D11 from std::vector data. The texture will be used as the variable rate shading (VRS) surface. For testing purposes it is filled entirely with the value 11. A lookup table maps each texel, which covers a 16x16 pixel block, to a shading rate; in this case the value 11 results in 4x4 coarse shading.
However, some values sampled along the x axis are 0 instead of 11. The vertical sampling looks correct.
D3D11_TEXTURE2D_DESC srsDesc;
ZeroMemory(&srsDesc, sizeof(srsDesc));
srsDesc.Width = (UINT)g_variableShadingGranularity.x;
srsDesc.Height = (UINT)g_variableShadingGranularity.y;
srsDesc.ArraySize = (UINT)1;
srsDesc.MipLevels = (UINT)1;
srsDesc.SampleDesc.Count = (UINT)1;
srsDesc.SampleDesc.Quality = (UINT)0;
srsDesc.Format = DXGI_FORMAT_R8_UINT;
srsDesc.Usage = D3D11_USAGE_DEFAULT;
srsDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
srsDesc.CPUAccessFlags = (UINT)0;
srsDesc.MiscFlags = (UINT)0;
// fill texture
g_srs_texture_data = std::vector<unsigned int>(static_cast<std::size_t>(srsDesc.Width) * srsDesc.Height, 11);
D3D11_SUBRESOURCE_DATA sd = {};
const uint32_t rowPitch = srsDesc.Width * sizeof(unsigned int);
sd.pSysMem = static_cast<const void*>(g_srs_texture_data.data());
sd.SysMemPitch = rowPitch;
HRESULT ok = s_device->CreateTexture2D(&srsDesc, &sd, &g_shadingRateSurface);
To be more precise: after every correctly shaded 16x16 pixel block there are 3 black blocks along the x axis. The black blocks indicate that those pixel blocks were mapped to the "CULL" shading rate, which the lookup table defines for the value 0. If I change the lookup table so that 0 maps to, for example, the 4x4 shading rate, the image is rendered entirely with 4x4 coarse shading and no black bars appear. So 0s must be getting sampled from the texture, even though it should only contain 11s.
Any ideas what could be causing this?
I figured out the solution. The data I provide has to be of type unsigned char. That is because the texture format is DXGI_FORMAT_R8_UINT, which is 1 byte per texel, whereas unsigned int is 4 bytes. The UINT in the format name misled me into thinking it meant the C++ unsigned int type. This also explains the pattern of 3 black blocks: on a little-endian machine the unsigned int value 11 is stored as the bytes 0x0B 0x00 0x00 0x00, so every fourth texel reads 11 and the three texels after it read 0.
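For reference, here is a minimal sketch of the corrected fill code, assuming the same srsDesc as above and that g_srs_texture_data is redeclared as std::vector<uint8_t> (names kept from the question; uint8_t from <cstdint> stands in for unsigned char):

// One byte per texel to match DXGI_FORMAT_R8_UINT
g_srs_texture_data = std::vector<uint8_t>(static_cast<std::size_t>(srsDesc.Width) * srsDesc.Height, 11);
D3D11_SUBRESOURCE_DATA sd = {};
sd.pSysMem = g_srs_texture_data.data();
sd.SysMemPitch = srsDesc.Width * sizeof(uint8_t); // row pitch in bytes, 1 byte per texel
HRESULT hr = s_device->CreateTexture2D(&srsDesc, &sd, &g_shadingRateSurface);

With the original 4-byte elements, SysMemPitch was Width * 4, so each row still started on a byte with value 11, which is why only the horizontal sampling showed the three zero bytes between each 11.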