Tags: opengl, optimization, time, lag

GLSL time for function in shader


I would like to know whether there is a way to measure the time (or the number of operations) that a function takes in a GLSL program.
Being quite new to GLSL and to the way the GPU works, optimizing a GLSL shader is hard and time-consuming for me, and my multipass rendering is very laggy. My goal is to focus on the slowest functions first. Is there anything that could help me?

I'm working in VS2015, and sadly my GPU doesn't support Nsight.


Solution

  • Shaders run in parallel on the GPU. You can't determine the number of operations per shader, because you don't know how many GPU cores are executing it or how the GPU compiler optimized the shader code.

    You can, however, measure the time elapsed for a draw command. See more for example here, here and here
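
    The usual tool for this is an OpenGL timer query (`GL_TIME_ELAPSED`). A minimal sketch, assuming an OpenGL 3.3+ (or `ARB_timer_query`) context is already current and GLEW (or any loader) is initialized; the draw call inside the query is a placeholder for whatever you want to measure:

    ```cpp
    #include <cstdio>
    #include <GL/glew.h>  // or whichever GL function loader you use

    // Measures GPU time spent on the commands issued between
    // glBeginQuery and glEndQuery. Must be called with a current GL context.
    void timeDrawCall()
    {
        GLuint query;
        glGenQueries(1, &query);

        glBeginQuery(GL_TIME_ELAPSED, query);
        // ... issue the draw command(s) to measure, e.g.:
        // glDrawArrays(GL_TRIANGLES, 0, vertexCount);
        glEndQuery(GL_TIME_ELAPSED);

        // Busy-wait until the GPU result is available (fine for profiling,
        // but it stalls the CPU).
        GLint available = 0;
        while (!available)
            glGetQueryObjectiv(query, GL_QUERY_RESULT_AVAILABLE, &available);

        GLuint64 nanoseconds = 0;
        glGetQueryObjectui64v(query, GL_QUERY_RESULT, &nanoseconds);
        std::printf("draw took %.3f ms\n", nanoseconds / 1.0e6);

        glDeleteQueries(1, &query);
    }
    ```

    To find the slow parts of a multipass pipeline, wrap each pass in its own query and compare the times. In production code you would avoid the busy-wait by reading each query's result a frame or two after issuing it.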