Tags: java, opengl-es, arm, fragment-shader, imx6

Fragment shader behaves unexpectedly (Test grid)


I'm trying to write a simple fragment shader to display a grid (or rather a checker pattern) on a polygon. I want this pattern to "remain in place", i.e. when the polygon itself moves, the squares stay where they are, so the resulting pattern kind of slides across the surface of the polygon.

I'm developing this in Java using LWJGL for an ARM-based embedded system, and I can debug both remotely on the ARM device connected to my PC and locally on the PC itself. I use IntelliJ IDEA for this.

On PC my program defaults to using OpenGL 3.2. On ARM the context is OpenGL ES 3.0. The graphics card on ARM is Vivante GC 2000.

Here's the problem: locally, on my PC, the shader works flawlessly, just like I want it to. But when I go to ARM, the pattern is jittering, distorting, and going out of sync between the two triangles that make up my polygon. The interesting fact is that the pattern changes and moves based on camera position, even though the shader uses only the model matrix and the vertex positions of the plane for its calculations, both of which remain exactly the same between frames (I checked). Yet camera position somehow affects the result dramatically, which shouldn't happen.

Here's my vertex shader:

#version 300 es

layout (location=0) in vec3 position;

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelMatrix;

out highp vec3 vertexPosition;

void main()
{   
    // generate position data for the fragment shader
    // does not take view matrix or projection matrix into account
    vec4 vp = modelMatrix * vec4(position, 1.0);
    vertexPosition = vec3(vp.x, vp.y, vp.z);

    // position data for the OpenGL vertex drawing
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

fragment shader:

#version 300 es

precision highp float;

in highp vec3 vertexPosition;

out mediump vec4 fragColor;

void main()
{
    highp float c = float((int(round(vertexPosition.x/5.0))+int(round(vertexPosition.z/5.0))) % 2);

    fragColor = vec4(vec3(c/2.0 + 0.3), 1);
}
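For reference, the same checker computation can be reproduced on the CPU to sanity-check the math. Below is my own rough Java port (not from the original project). Two details are worth flagging: in GLSL ES 3.0, round() may break exact-.5 ties in an implementation-chosen direction, and % is undefined when an operand is negative, so this sketch uses Math.floorMod to keep the result well-defined for negative cells:

```java
public class CheckerDemo {
    // CPU port of: c = float((int(round(x/5.0)) + int(round(z/5.0))) % 2)
    // Math.floorMod keeps the result in {0, 1} even when the cell sum is
    // negative, whereas GLSL ES leaves % undefined for negative operands.
    static float checker(float x, float z) {
        int cell = Math.round(x / 5.0f) + Math.round(z / 5.0f);
        return (float) Math.floorMod(cell, 2);
    }

    public static void main(String[] args) {
        System.out.println(checker(0f, 0f));  // 0.0 -> dark square
        System.out.println(checker(5f, 0f));  // 1.0 -> light square
        System.out.println(checker(-5f, 0f)); // 1.0 -> well-defined for negatives too
    }
}
```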

As you can see, I've tried tinkering with the precision of float operations, alas to no avail. You can also notice that only modelMatrix and the vertex positions of the polygon affect fragColor, and I can guarantee that I checked them and they do not change between shader calls, yet somehow camera movement ends up affecting the resulting fragment colors/pattern.

It's also worth noting that no other textures on the objects in the scene seem to be affected by the issue.

Here're a couple of screenshots:

How it looks locally (everything works):

(screenshot)

Here's how it looks on the ARM device:

(screenshot)

Notice the textures shifted between triangles, and there's a weird line between them that seems to have been filled by a completely different set of rules entirely. The problem doesn't appear at all viewing angles - only some. If I point the camera in other directions, sometimes I can move it rather freely with no artifacts visible.

The other thing I've noticed is that the bigger my polygon is, the more jittering and artifacting occurs, which leads me to believe it has to do with the precision of either the vertex positions (in the vertex shader) or the interpolated fragment positions derived from them in the fragment shader.
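That size-dependence is consistent with how float precision works: a 32-bit float has about 24 bits of mantissa, so the spacing between adjacent representable values (the ulp) grows with magnitude, and interpolated world-space positions far from the origin quantize more coarsely. A small Java illustration of the magnitudes involved (my own, just to show the scale of the effect):

```java
public class UlpDemo {
    public static void main(String[] args) {
        // Spacing between adjacent representable floats at various magnitudes.
        // Interpolated positions stored in float can't be finer-grained than this.
        System.out.println(Math.ulp(1.0f));      // ~1.19e-7
        System.out.println(Math.ulp(1000.0f));   // ~6.1e-5
        System.out.println(Math.ulp(100000.0f)); // 0.0078125
        // If the GPU internally drops to fp16-like precision anywhere,
        // the error grows by orders of magnitude on top of this.
    }
}
```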

Edit: I've checked the precision of float using glGetShaderPrecisionFormat(GL_FRAGMENT_SHADER, GL_HIGH_FLOAT, range, precision); and it's the same on both the local PC and the ARM device. So it shouldn't be that, unless there's some flag I'm missing that has to be specifically enabled.

Edit: And yet another thing I noticed: locally, on the PC, the little test grass block appears in the center of one of the squares, while on ARM the squares are shifted by half, so it stands directly on an intersection (if I angle the camera so that the artifact doesn't happen). I can't rightly explain this, because in my mind the calculation should yield the same result.

Either way, I actually need to solve this problem somehow, and I would appreciate an answer.


Solution

  • I'll post my final answer to this problem, something that took a lot of struggle, searching, and trial and error. There are actually two separate issues depicted in the screenshots, so I'll cover them both.

    Regarding the strange texture shift at the polygon intersection: it has been confirmed that this is a Vivante driver issue. Coordinates for points that lie too far outside the frustum are computed incorrectly for the fragment shader (note that they are completely fine in the vertex shader, hence the plane doesn't appear torn - only the texture suffers).

    There doesn't seem to be a driver fix at the moment.

    You can, however, implement a workaround: split the mesh. Instead of having a large quad made of 2 triangles, build it from several smaller quads. In my case I made a 6x6 structure: 36 quads and 72 triangles total. That way, no points are ever too far outside the frustum, and precision stays good.

    Yes, this is FAR from ideal, but it's better than having your fragment shader produce visual artifacts.
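The splitting itself is easy to generate in code. A hypothetical Java sketch (names and layout are mine, not from the original project) that builds an n x n grid of quads in the XZ plane instead of one big quad:

```java
public class GridMesh {
    // Builds an n x n grid of quads spanning [0,size] x [0,size] in the XZ
    // plane: (n+1)^2 vertices, and 6 indices (2 triangles) per quad.
    static float[] vertices(int n, float size) {
        float[] v = new float[(n + 1) * (n + 1) * 3];
        float step = size / n;
        int k = 0;
        for (int z = 0; z <= n; z++) {
            for (int x = 0; x <= n; x++) {
                v[k++] = x * step; // x
                v[k++] = 0f;       // y (flat plane)
                v[k++] = z * step; // z
            }
        }
        return v;
    }

    static int[] indices(int n) {
        int[] idx = new int[n * n * 6];
        int k = 0;
        for (int z = 0; z < n; z++) {
            for (int x = 0; x < n; x++) {
                int i = z * (n + 1) + x; // first corner of this quad
                idx[k++] = i;     idx[k++] = i + n + 1; idx[k++] = i + 1;
                idx[k++] = i + 1; idx[k++] = i + n + 1; idx[k++] = i + n + 2;
            }
        }
        return idx;
    }
}
```

For the 6x6 case described above this yields 49 vertices and 216 indices, i.e. 36 quads and 72 triangles.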


    Concerning the colors. As you may notice, in the screenshots the colors come out as grey and black, where they should be light grey and dark grey.

    The solution wasn't easy to arrive at. It's the system locale.

    As some of you may not be aware, in some Slavic locales the decimal separator is a comma, not a dot. So basically the line:

    fragColor = vec4(vec3(c/2.0 + 0.3), 1);
    

    gets turned into

    fragColor = vec4(vec3(c/2 , 0 + 0 , 3), 1);
    

    As you can guess, this is completely and utterly wrong. I'm actually impressed that GLSL seems completely fine with this and doesn't give any errors - probably because vec3 can take 3 coordinates during construction, or something. In any case, this bug made all floating-point constants in my code wrong, and the resulting calculation is wrong as well.
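On the Java side, the fix is to make any float-to-string formatting locale-independent wherever shader source is generated. A minimal sketch, assuming the shader string is built with String.format (the exact call site in the original project is unknown):

```java
import java.util.Locale;

public class ShaderFormat {
    public static void main(String[] args) {
        float c = 0.3f;
        // Locale-dependent: under e.g. a Russian locale the decimal
        // separator becomes a comma, corrupting generated GLSL.
        String bad = String.format(Locale.forLanguageTag("ru-RU"), "%f", c);
        // Locale-independent: always uses '.' as the decimal separator.
        String good = String.format(Locale.ROOT, "%f", c);
        System.out.println(bad);  // 0,300000
        System.out.println(good); // 0.300000
    }
}
```

Alternatively, calling Locale.setDefault(Locale.ROOT) at startup fixes every such site at once, at the cost of changing formatting for the whole process.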

    Just in case anyone runs into that problem.