Tags: unity-game-engine, opengl, shader, shader-graph

Difference between world-normal calculated by sampling view-normal texture and world-normal texture


I have a custom renderer feature that creates a normal texture and uses it. The normal texture can be in view space or world space. If I create a view-space normal texture, sample it with the UV node in a full screen shader graph, transform that normal to world space (with unity_CameraToWorld or UNITY_MATRIX_I_V), and compare the result with the world-space normal texture, they are different (going the other direction also differs).

I am using an orthographic camera rotated 30 degrees around X and 45 degrees around Y. Could that be related?

[Image: view normal texture transformed to world normal]

[Image: world normal texture]

Additional detail: the normal texture is created with just the Normal Vector node in a shader graph, so I can output either a world-space or a view-space normal texture. My custom renderer feature renders this shader graph into the normal texture and assigns it to a texture property of another, full screen shader graph. In the full screen shader graph the normal texture itself looks correct; I sample it and use the r, g, b channels as a float3 ViewSpaceNormal, then do WorldSpaceNormal = mul((float3x3)unity_CameraToWorld, ViewSpaceNormal). It seems that UNITY_MATRIX_I_V and unity_CameraToWorld are not the view-to-world matrices I expected inside the full screen shader graph.
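A minimal sketch of that transform step, assuming the sampled view-space normal is passed in unmodified (still in the [-1, 1] range) to a Custom Function node in the full screen shader graph; the function and parameter names are placeholders, not the question's actual property names:

```hlsl
// Minimal sketch (assumptions above): transform a view-space normal sampled
// from the normal texture into world space inside the full screen shader graph.
void ViewToWorldNormal_float(float3 ViewSpaceNormal, out float3 WorldSpaceNormal)
{
    // The inverse view matrix rotates view-space directions into world space.
    // For a pure rotation (no scale) the 3x3 part is enough for normals.
    WorldSpaceNormal = normalize(mul((float3x3)UNITY_MATRIX_I_V, ViewSpaceNormal));
}
```

With a floating-point color format the output of this step should match the world-space normal texture up to precision; with an 8-bit format the sampled values are already altered before this transform runs.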

I expected the view normal texture, once sampled and transformed to world space, to match the world normal texture. If you need additional information, please let me know.


Solution

  • The problem was the color format of the normal texture. In my case, after changing the color format to ARGB Half/Float, it works correctly. An 8-bit unsigned format presumably cannot store the negative components of an unremapped normal, so the sampled values no longer matched the real normals; the sketch below illustrates this.
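An illustrative sketch (not the asker's code) of why the format matters, assuming the normal is written to the render target without being remapped to [0, 1]; the function name and example values are made up for illustration:

```hlsl
// Illustrative only: what each color format effectively stores when a signed,
// unremapped view-space normal is written to it, and how that corrupts the
// later view-to-world transform.
float3 CompareStoredNormals()
{
    float3 viewNormal = float3(-0.30, 0.60, -0.74);   // example view-space normal

    // An 8-bit UNORM target clamps to [0, 1] and quantizes to 8 bits,
    // so the negative x and z components are lost:
    float3 storedUNorm = round(saturate(viewNormal) * 255.0) / 255.0;   // ~(0, 0.6, 0)

    // An ARGBHalf / ARGBFloat target keeps sign and enough precision:
    float3 storedFloat = viewNormal;

    // The transform of the clamped value no longer matches the true world normal:
    float3 worldFromUNorm = mul((float3x3)UNITY_MATRIX_I_V, storedUNorm);
    float3 worldFromFloat = mul((float3x3)UNITY_MATRIX_I_V, storedFloat);

    return worldFromFloat - worldFromUNorm;   // non-zero difference
}
```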