
GLSL 3D Noise Implementation on ATI Graphics Cards


I have tried so many different strategies to get a usable noise function, and none of them work. So, how do you implement Perlin noise on an ATI graphics card in GLSL?

Here are the methods I have tried. First, I put the permutation and gradient data into a GL_RGBA 1D texture and sampled it with texture1D. However, one call to this noise implementation leads to 12 texture fetches and kills the framerate.
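The lookup in that version boils down to something like this sketch (the sampler name and table layout are just how I happened to set it up); every hashed cell corner and gradient goes through a fetch like this, which is how one noise call balloons into a dozen texture reads:

uniform sampler1D permGradTex; // 256-texel GL_RGBA table: rgb = gradient, a = permutation value

vec4 tableLookup(float i)
{
    // sample the texel center for index i in a 256-entry table
    return texture1D(permGradTex, (i + 0.5) / 256.0);
}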

I have tried uploading the permutation and gradient data into a uniform vec4 array, but the compiler won't let me access an element of the array unless the index is a constant. For example:

int i = 10;
vec4 a = noise_data[i];

gives this compiler error:

ERROR: 0:43: Not supported when use temporary array indirect index.

Meaning I can only retrieve the data like this:

vec4 a = noise_data[10];

I also tried programming the array directly into the shader, but I got the same index issue. I hear NVIDIA graphics cards will actually allow this method, but ATI will not.
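To be concrete, the in-shader version looked roughly like this (the array initializer syntax needs #version 120, and the values here are only placeholders):

const vec4 noise_data[4] = vec4[4](
    vec4( 1.0,  1.0, 0.0, 151.0),
    vec4(-1.0,  1.0, 0.0, 160.0),
    vec4( 1.0, -1.0, 0.0, 137.0),
    vec4(-1.0, -1.0, 0.0,  91.0));

vec4 lookup(int i)
{
    return noise_data[i]; // ATI rejects this unless i is a compile-time constant
}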

I tried making a function that returned a specific hard-coded data point depending on the input index, but with 64 if statements and 12 calls per noise evaluation, it made the linking time unbearable.
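The shape of that function was roughly this, only with 64 branches instead of a few (values again placeholders):

vec4 noise_lookup(int i)
{
    if (i == 0) return vec4( 1.0,  1.0, 0.0, 151.0);
    if (i == 1) return vec4(-1.0,  1.0, 0.0, 160.0);
    if (i == 2) return vec4( 1.0, -1.0, 0.0, 137.0);
    // ...61 more hard-coded branches...
    return vec4(0.0);
}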

ATI does not support the "built-in" noise functions in GLSL, and I can't just precompute the noise and import it as a texture, because I am dealing with fractals. That means I need the effectively infinite resolution of computing the noise at run time.

So the overarching question is...

How?


Solution

  • noise() is well known for not being implemented...

    Roll your own:

    // simple linear congruential generator: Xn+1 = (a*Xn + c) mod m
    const int a = 75;     // example multiplier from Wikipedia's LCG parameter table
    const int m = 65537;  // example modulus, small enough to avoid 32-bit overflow
    int c;                // increment, derived from the pixel position
    int Xn;               // current state
    
    void srand(int x, int y, int width){ // x, y in pixels
        c = x + y*width;
        Xn = c % m;
    }
    
    int rand(){
        Xn = (a*Xn + c) % m;
        return Xn;
    }
    

    For other choices of a and m, see the Wikipedia article on linear congruential generators.

    It's not perfect, but often good enough.
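
    A minimal way to wire it into a fragment shader, assuming the srand/rand above live in the same shader and you pass the viewport width in as a uniform (name it however you like):

    uniform int screen_width; // viewport width in pixels

    float rand01()
    {
        return float(rand()) / float(m); // map the LCG state into [0, 1)
    }

    void main()
    {
        // seed once per fragment from its pixel position, then draw the value
        srand(int(gl_FragCoord.x), int(gl_FragCoord.y), screen_width);
        float n = rand01();
        gl_FragColor = vec4(n, n, n, 1.0);
    }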