Tags: shader, webgl, edge-detection

WebGL - How to get a texture with normals data for edge-detection post-processing


While looking for a WebGL edge-detection method, I saw that Unity has a solution: it uses Camera-DepthNormalTexture.shader to write depth and normals data into a texture, and I plan to use the Roberts operator on that data to decide whether a pixel is an edge. In WebGL, however, I can only get a depth texture from a framebuffer; I cannot get a depth-normals texture like Unity's. I wrote a WebGL test that uses only the depth texture from the framebuffer, but it only detects edges where the depth difference is large and cannot detect all the edges of a solid-color model. I want to know whether there are other ways to solve this kind of problem.

[(solid color)WebGL test rendering pic][1]
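
For reference, the Roberts-style check I have in mind looks roughly like the sketch below. The uniform and varying names (u_normalDepthTex, u_texelSize, v_uv) and the thresholds are placeholders of my own, and it assumes a texture with the normal encoded in .rgb and the depth in .a.

precision mediump float;

// normals encoded in rgb, depth in a (assumed layout)
uniform sampler2D u_normalDepthTex;
// 1.0 / resolution of that texture (assumed uniform)
uniform vec2 u_texelSize;

// uv from a fullscreen-quad vertex shader (assumed varying)
varying vec2 v_uv;

// returns 1.0 when two samples are "similar" (no edge between them)
float isSame(vec4 a, vec4 b) {
  float normalDiff = dot(abs(a.rgb - b.rgb), vec3(1.0));
  float depthDiff  = abs(a.a - b.a);
  // thresholds are arbitrary and need tuning per scene
  return (normalDiff < 0.1 && depthDiff < 0.01) ? 1.0 : 0.0;
}

void main() {
  // Roberts cross: compare the two diagonal pairs of samples
  vec4 s1 = texture2D(u_normalDepthTex, v_uv + u_texelSize * vec2( 1.0,  1.0));
  vec4 s2 = texture2D(u_normalDepthTex, v_uv + u_texelSize * vec2(-1.0, -1.0));
  vec4 s3 = texture2D(u_normalDepthTex, v_uv + u_texelSize * vec2(-1.0,  1.0));
  vec4 s4 = texture2D(u_normalDepthTex, v_uv + u_texelSize * vec2( 1.0, -1.0));

  float edge = 1.0 - isSame(s1, s2) * isSame(s3, s4);
  gl_FragColor = vec4(vec3(1.0 - edge), 1.0);  // black edges on white
}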

I set the vertex normal data in advance.

Using the shader provided in gman's answer, I rendered the normals data to a texture in a framebuffer in a new pass. The result seems to draw edges, but the depth seems to be wrong.

framebuffer = gl.createFramebuffer();
// rgb texture
gl.bindTexture(gl.TEXTURE_2D, texture);
// ... texImage2D / texParameteri set up
// depth texture
gl.bindTexture(gl.TEXTURE_2D, depthTexture);
// ... texImage2D / texParameteri set up


// Create an FBO for the normals data
normalFramebuffer = gl.createFramebuffer();
gl.bindTexture(gl.TEXTURE_2D, normalTexture);
// ... texImage2D / texParameteri set up


// First pass: color + depth; second pass: normals
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
obj.render('rgbaData shader');
gl.bindFramebuffer(gl.FRAMEBUFFER, normalFramebuffer);
obj.render('normalsData shader');

// Edge-detection pass to the canvas; each uniform1i call tells a
// sampler which texture unit to read from
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.useProgram('my edge detection shader');
gl.uniform1i(textureLoc, 0);        // sampler location for the rgb texture
gl.uniform1i(depthTextureLoc, 1);   // sampler location for the depth texture
gl.uniform1i(normalTextureLoc, 2);  // sampler location for the normals texture
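
One detail the snippet above leaves out: gl.uniform1i only tells each sampler which texture unit to read from, so the three textures still have to be bound to those units before the edge-detection draw. A minimal sketch, where edgeProgram and drawFullscreenQuad are placeholders for my compiled edge-detection program and my fullscreen-quad draw call:

// Bind each texture to the unit its sampler uniform points at.
// edgeProgram and drawFullscreenQuad are placeholders, not names from my code.
gl.useProgram(edgeProgram);

gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, texture);        // rgb color  -> unit 0

gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, depthTexture);   // depth      -> unit 1

gl.activeTexture(gl.TEXTURE2);
gl.bindTexture(gl.TEXTURE_2D, normalTexture);  // normals    -> unit 2

drawFullscreenQuad();  // runs the edge-detection shader over the whole screen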

Solution

  • Unity generates the texture with normals even in WebGL, so this is not a limit of WebGL. If you want a texture with normals, then render a texture with normals.

    Assuming you have geometry data with normals, you'd use a vertex shader something like

    attribute vec4 position;
    attribute vec3 normal;
    
    uniform mat4 projection;
    uniform mat4 modelView;
    uniform mat4 normalMatrix;
    
    varying vec3 v_normal;
    
    void main() {
      gl_Position = projection * modelView * position;
      v_normal = mat3(normalMatrix) * normal;
    }
    

    and a fragment shader like

    precision mediump float;
    varying vec3 v_normal;
    void main() {
      gl_FragColor = vec4(normalize(v_normal) * 0.5 + 0.5, gl_FragCoord.z);
    }
    

    And render your shapes to a texture via a framebuffer (a minimal setup sketch follows at the end of this answer). You then have a texture with normals and depth in it, which you can pass to your edge-check shader.

    You can read more about shaders with normals here.
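
    For completeness, here is a minimal sketch of the "render to a texture via a framebuffer" step for the normals pass, assuming a WebGL1 context. normalProgram, drawScene, width, and height are placeholders for your own program (built from the shaders above), your scene draw call, and the render-target size.

    // Create the render target for the normals + depth shader above.
    const normalTexture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, normalTexture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    // Attach it as the color target of a framebuffer, plus a depth renderbuffer
    // so the normals pass is depth-tested like the main pass.
    const normalFramebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, normalFramebuffer);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                            gl.TEXTURE_2D, normalTexture, 0);

    const depthBuffer = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, depthBuffer);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, width, height);
    gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                               gl.RENDERBUFFER, depthBuffer);

    // Draw the scene with the normals shader into that framebuffer.
    gl.viewport(0, 0, width, height);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    gl.useProgram(normalProgram);
    drawScene();  // placeholder for your own draw call

    // Back to the canvas for the edge-detection pass.
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);

    Note that an UNSIGNED_BYTE target only keeps 8 bits of precision for the depth value written into the alpha channel, so fine depth differences may be lost.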