I would like to display vertex normals as lines originating at the vertex and extending in the same direction as the normal for a distance based on a specified scale.
The easiest way I can see of doing this is to calculate the start and end points for each normal in model space on the CPU, use the inverse transpose of the modelview matrix to transform each normal correctly into view space, and draw the results as lines on the GPU.
Is there a better way to do this? I thought of passing the vertex position and normal into the shader so that it could calculate the line positions itself, but that seems more complex and, I believe, would require a branch in the shader.
In OpenGL ES 2.0 this will definitely not be possible without at least a bit of additional computation on the CPU. You cannot just render your existing vertices and magically get a line for each one. Even if you compute the line coordinates inside the vertex shader based on the normal, you still have to submit a line (and thus two vertices) for each original vertex of your mesh. So this would at least require some restructuring of your vertex data.
You are also correct that this would require you to somehow disambiguate between the starting vertex and the end vertex of a line inside the shader. But you don't necessarily need a branch: just set the normal of the starting vertex to all 0s (assuming you don't need the normal for anything else when rendering the line, but lines usually aren't lit anyway):
CPU:
for each original vertex:
generate two line vertices with attributes of original vertex
set normal attribute of first line vertex to 0-vector
...
render all lines using the following shader
GPU:
attribute vec3 position;
attribute vec3 normal;
uniform mat4 projMat;
uniform mat4 modelviewMat;
uniform float scale;    // the specified scale from your question
...
void main()
{
    ...
    vec3 modelPos = position + scale * normal;
    gl_Position = projMat * modelviewMat * vec4(modelPos, 1.0);
}
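The CPU-side loop from the pseudocode above can be sketched in C. The Vertex struct and function name here are assumptions for illustration, not part of any OpenGL API; the point is just that each original vertex becomes two line vertices, the first with a zeroed normal:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical interleaved vertex layout: 3-float position, 3-float normal. */
typedef struct { float pos[3]; float nrm[3]; } Vertex;

/* Expand `count` mesh vertices into 2*count line vertices. The first vertex
 * of each line keeps the original position but gets a zero normal, so the
 * shader's `position + scale * normal` leaves it in place; the second keeps
 * the original normal and ends up at the far end of the line. `out` must
 * have room for 2*count vertices. */
static void build_normal_lines(const Vertex *in, size_t count, Vertex *out)
{
    for (size_t i = 0; i < count; ++i) {
        out[2*i] = in[i];
        for (int k = 0; k < 3; ++k)
            out[2*i].nrm[k] = 0.0f;   /* start point: zero normal */
        out[2*i + 1] = in[i];         /* end point: original normal */
    }
}
```

The resulting buffer is then drawn with GL_LINES.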
But I don't think this would really buy you anything: you somehow have to tell the shader about the difference between the two line endpoints, and once you do that you need to generate two distinct vertices for each line on the CPU anyway. And at that point you can just change the second line vertex's coordinates based on the normal instead of doing the computation in the shader. That little vec3-saxpy shouldn't really cost you much. But maybe the shader approach is faster, so try it.
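That CPU-only alternative is equally small. A sketch, again with an assumed Vertex layout and a scale parameter (both illustrative, not from any API): the endpoints are computed directly, so the line shader needs nothing but the plain modelview-projection transform.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical interleaved vertex layout: 3-float position, 3-float normal. */
typedef struct { float pos[3]; float nrm[3]; } Vertex;

/* For each original vertex emit one line: start at the vertex position,
 * end at position + scale * normal, all computed on the CPU. `out` must
 * have room for 2*count positions of 3 floats each. */
static void build_normal_lines_cpu(const Vertex *in, size_t count,
                                   float scale, float (*out)[3])
{
    for (size_t i = 0; i < count; ++i) {
        for (int k = 0; k < 3; ++k) {
            out[2*i][k]     = in[i].pos[k];
            out[2*i + 1][k] = in[i].pos[k] + scale * in[i].nrm[k];
        }
    }
}
```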
The situation would be different in desktop GL 3+, where you can use the geometry shader. There you just render all original vertices as points, and in the geometry shader generate a line for each point, computing its coordinates from the point's normal. Then there is really no need for any CPU-side computation. But unfortunately ES 2 doesn't have a geometry shader (and neither does ES 3.0; geometry shaders only became core in ES 3.2).
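For desktop GL, such a geometry shader could look roughly like this. This is a sketch, not tested code: it assumes the vertex shader forwards the model-space position through gl_Position untransformed and passes the normal out as vNormal, and it reuses the projMat/modelviewMat/scale uniform names from above:

```glsl
#version 150
layout(points) in;
layout(line_strip, max_vertices = 2) out;

in vec3 vNormal[];          // model-space normal from the vertex shader

uniform mat4 projMat;
uniform mat4 modelviewMat;
uniform float scale;        // length of the normal lines

void main()
{
    // gl_in[0].gl_Position holds the untransformed model-space position
    vec4 p = gl_in[0].gl_Position;
    mat4 mvp = projMat * modelviewMat;

    gl_Position = mvp * p;                                    // line start
    EmitVertex();
    gl_Position = mvp * (p + vec4(scale * vNormal[0], 0.0));  // line end
    EmitVertex();
    EndPrimitive();
}
```

With this, you bind the mesh's original vertex buffer unchanged and draw it once with GL_POINTS.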