Tags: javascript, three.js, point-clouds, buffer-geometry

three.js point clouds, BufferGeometry and incorrect transparency


The problem: I have a point cloud with quite a lot of data points (around one million). When I apply transparency to the rendered points, the transparency somehow does not reveal what is behind the rendered points:

[Screenshot: incorrect rendering]

As you can see with the marked point, it does not show what it should; it is as if there is a problem with the buffering.

I use three.js to create the point cloud with the following setup:

The renderer:

this.renderer = new THREE.WebGLRenderer({
    canvas: this.canvas,
    antialias: true
});

The material:

this.pointMaterial = new THREE.ShaderMaterial( {
    uniforms: {
        time:       { type: "f", value: 1.0 }
    },
    vertexShader:   document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent:    true
});

The vertex shader:

attribute float size;
attribute float opacity;
attribute vec3 color;
varying vec3 vColor;
varying float vOpacity;

void main() {
    vColor = color;
    vOpacity = opacity;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_PointSize = size * (500.0 / length(mvPosition.xyz));
    gl_Position = projectionMatrix * mvPosition; 
}

The fragment shader:

uniform float time;
varying vec3 vColor;
varying float vOpacity;

void main() {
    gl_FragColor = vec4(vColor, vOpacity);
}

The geometry (where I left out the part where I populate the arrays):

var bufferGeometry = new THREE.BufferGeometry();

var vertices = new Float32Array(vertexPositions.length * 3);
var colors = new Float32Array(vertexColors.length * 3);
var sizes = new Float32Array(vertexSizes.length);
var opacities = new Float32Array(vertexOpacities.length);

bufferGeometry.addAttribute('position', new THREE.BufferAttribute(vertices, 3));
bufferGeometry.addAttribute('color', new THREE.BufferAttribute(colors, 3));
bufferGeometry.addAttribute('size', new THREE.BufferAttribute(sizes, 1));
bufferGeometry.addAttribute('opacity', new THREE.BufferAttribute(opacities, 1));

this.points = new THREE.Points(bufferGeometry, this.pointMaterial);
this.scene.add(this.points);

I also tried this with the built-in points material, where the same thing happens:

this.pointMaterial = new THREE.PointsMaterial({
    size: this.pointSize,
    vertexColors: THREE.VertexColors,
    transparent: true,
    opacity: 0.25
});

Is this a bug, expected behaviour, or am I doing something wrong?


Solution

  • The way the alpha blending equation works is that the destination colour (what has already been rendered into the framebuffer) is treated as what is behind, and the source colour (the fragment currently being drawn) is blended over it as what is in front. This means you need to render your transparent geometry in sorted order from back to front, so that geometry in front will correctly blend with geometry behind.

    If all you have is transparent geometry, you can simply disable depth testing, render in back-to-front (reverse depth sorted) order, and it will work. If you have opaque geometry as well, you need to first render all the opaque geometry normally, then disable depth writing (not depth testing), render the transparent geometry in back-to-front order, and finally re-enable depth writing. The sketches below show how this maps onto the setup from the question.

    Here are some answers to similar questions if you're interested in learning a bit more.
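To make the ordering dependence concrete, here is the colour part of the standard blend (what THREE.NormalBlending does for RGB) written out as a plain JavaScript function. This is only an illustrative sketch, not three.js code: src is the fragment being drawn now, dst is whatever is already in the framebuffer, so whichever fragment is drawn later always ends up "in front".

// Straight-alpha "over" compositing: the incoming fragment (src) is blended
// over the value already in the framebuffer (dst).
function blendOver(src, dst) {
    return {
        r: src.r * src.a + dst.r * (1 - src.a),
        g: src.g * src.a + dst.g * (1 - src.a),
        b: src.b * src.a + dst.b * (1 - src.a)
    };
}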
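In three.js terms, the usual practical fix for a single THREE.Points object is to keep the depth test (so opaque geometry still occludes the points) while disabling depth writes on the point material, because three.js can sort whole objects but not the individual vertices inside one Points object. A minimal sketch based on the material from the question - only the depthWrite flag is new:

this.pointMaterial = new THREE.ShaderMaterial({
    uniforms: {
        time: { type: "f", value: 1.0 }
    },
    vertexShader:   document.getElementById('vertexShader').textContent,
    fragmentShader: document.getElementById('fragmentShader').textContent,
    transparent:    true,
    depthWrite:     false  // depth-tested against opaque geometry, but the points no longer occlude each other
});

This removes the hard "holes" caused by the depth buffer, but the blending itself is still order-dependent. Fully correct output would require re-sorting roughly a million points back to front whenever the camera moves, which is expensive, so an order-independent mode such as blending: THREE.AdditiveBlending (which is commutative and needs no sorting) is often the more practical choice for large point clouds.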