When adding or subtracting numbers around 0.002 in an HLSL compute shader, I'm seeing some really odd behaviour.
#pragma kernel Fade

RWTexture2D<float4> Result;

[numthreads(8,8,1)]
void Fade (uint3 id : SV_DispatchThreadID)
{
    // Read the current pixel, add a small constant to every channel, and write it back.
    Result[id.xy] = Result[id.xy] + 0.001;
}
Result is the texture that gets displayed; it's initially passed in as black. The idea is to achieve a slow fade to white, which works when I set the increment to 0.0023 or above, but not below.
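For context, the shader is dispatched once per frame, roughly like this (the MonoBehaviour and field names are illustrative, not my exact code):

using UnityEngine;

// Illustrative driver: dispatches the Fade kernel once per frame.
public class FadeDriver : MonoBehaviour
{
    public ComputeShader fadeShader;    // the compute shader shown above
    public RenderTexture renderTexture; // the texture bound to Result
    int kernel;

    void Start()
    {
        kernel = fadeShader.FindKernel("Fade");
        fadeShader.SetTexture(kernel, "Result", renderTexture);
    }

    void Update()
    {
        // One thread per pixel, in 8x8 groups to match [numthreads(8,8,1)].
        fadeShader.Dispatch(kernel,
            renderTexture.width / 8,
            renderTexture.height / 8,
            1);
    }
}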
Several cases result in different behaviour, e.g. when subtracting from a non-black colour.
Any hints on this would be appreciated.
I found out the answer: Unity was defaulting to a very limited RenderTextureFormat. The default format (typically ARGB32) stores only 8 bits per channel, so every value written to the texture gets quantized to multiples of 1/255 ≈ 0.0039. An increment smaller than half a step (≈0.002) rounds back down to the value already stored, which is why 0.0023 worked (round(0.0023 × 255) = 1) but 0.001 did not (round(0.001 × 255) = 0). I have now modified my render texture definition to read
renderTexture = new RenderTexture(width, height, 32, RenderTextureFormat.ARGBFloat);
And now everything works flawlessly.
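For completeness, a minimal version of the setup looks something like this (the size and field names are illustrative); note that a texture written by a compute shader also needs enableRandomWrite set before Create():

using UnityEngine;

// Illustrative setup: a float render texture that a compute shader can write to.
public class FadeSetup : MonoBehaviour
{
    public ComputeShader fadeShader;
    RenderTexture renderTexture;

    void Start()
    {
        // ARGBFloat stores 32 bits per channel, so tiny increments like
        // 0.001 are preserved instead of being rounded away.
        renderTexture = new RenderTexture(256, 256, 32, RenderTextureFormat.ARGBFloat);
        renderTexture.enableRandomWrite = true; // required for RWTexture2D access
        renderTexture.Create();

        int kernel = fadeShader.FindKernel("Fade");
        fadeShader.SetTexture(kernel, "Result", renderTexture);
    }
}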