Tags: ios, shader, scenekit, fragment-shader, normals

SceneKit sphere normals appear to change as the camera moves?


The normals of an SCNSphere in the following example appear to change as the camera rotates, which doesn't seem right and suggests I'm misunderstanding something fundamental.

In a scene with cylinders placed to highlight the x, y, and z axes (colored red, green, and blue respectively), a sphere with unit radius, and the following custom fragment shader modifier:

_output.color.rgb = _surface.normal;

a camera at position (0, 0, 10) (i.e. along the z-axis) renders the following:

[Image: the sphere rendered with the camera at (0, 0, 10)]

This looks right, since normals pointing outward along the x, y, and z axes are red, green, and blue respectively, with the expected gradations in between. But if we then rotate the camera to position (10, 0, 0) (i.e. along the x-axis), this is the rendered scene:

[Image: the sphere rendered with the camera at (10, 0, 0)]

which suggests that the normals have changed, since the gradation no longer corresponds to the expected directions in the coordinate space (in fact, this color pattern holds for any rotation of the camera).
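
For reference, a minimal Swift sketch of this setup (the axis cylinders are omitted, and `scene` stands in for whatever SCNScene is being rendered):

    import SceneKit

    // Unit sphere whose fragment shader modifier writes the surface normal
    // straight into the output color.
    let sphere = SCNSphere(radius: 1.0)
    sphere.firstMaterial?.shaderModifiers = [
        .fragment: "_output.color.rgb = _surface.normal;"
    ]
    scene.rootNode.addChildNode(SCNNode(geometry: sphere))

    // Camera at (0, 0, 10), looking down the z-axis as in the first screenshot.
    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    cameraNode.position = SCNVector3(x: 0, y: 0, z: 10)
    scene.rootNode.addChildNode(cameraNode)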


Solution

  • From the documentation:

    Geometric fields (such as position and normal) are expressed in view space. You can use SceneKit’s uniforms (such as u_inverseViewTransform) to operate in a different coordinate space, but you must convert back to view space before writing results.

    So the normal is expressed in the coordinate system of the camera, not in object or world space, which is why the colors follow the camera as it rotates; one possible fix is sketched below.
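
    One way to act on that note, assuming the Metal renderer (where the documentation's u_inverseViewTransform is exposed as scn_frame.inverseViewTransform) and reusing the `sphere` geometry from the sketch in the question, is to rotate the view-space normal back into world space inside the modifier before visualizing it:

        sphere.firstMaterial?.shaderModifiers = [
            .fragment: """
            // _surface.normal is in view space; bring it back to world space.
            // Multiplying the direction (w = 0) by the inverse view transform is
            // enough here because the view transform is a rigid transform.
            float3 worldNormal =
                normalize((scn_frame.inverseViewTransform * float4(_surface.normal, 0.0)).xyz);
            _output.color.rgb = worldNormal;
            """
        ]

    With a modifier along these lines, the red/green/blue pattern should stay aligned with the world axes as the camera orbits the sphere.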