Tags: gpu, gpgpu, modeling, frame-rate, flops

GPU FLOPS and FPS


I am modelling a GPU (I cannot disclose which) to estimate the performance of OpenCL and OpenGL applications. The model can reasonably estimate the FLOPS of the executing app/kernel/code. Is there a way to estimate frames per second from the FLOPS, or is it better to model the framebuffer and estimate FPS from that?


Solution

  • As FPS is also influenced by the code that is running on the CPU, there's no way to make an accurate FPS prediction based on FLOPS alone.

    You have to execute the code and measure the application's FPS at runtime (a minimal measurement sketch follows below). Sorry!
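
As a starting point for measuring FPS at runtime, here is a minimal C++ sketch of a frame counter around a render loop. The `renderFrame` function is a hypothetical placeholder for whatever actually produces a frame (OpenGL draw calls, an OpenCL kernel dispatch, buffer swap); here it just sleeps to simulate per-frame work.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-in for whatever produces one frame (OpenGL draw calls,
// an OpenCL kernel dispatch, buffer swap, ...). Here it simply sleeps to
// simulate ~4 ms of combined CPU/GPU work per frame.
static void renderFrame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(4));
}

int main() {
    using clock = std::chrono::steady_clock;
    auto windowStart = clock::now();
    int framesInWindow = 0;

    for (;;) {                               // render loop
        renderFrame();
        ++framesInWindow;

        const double elapsed =
            std::chrono::duration<double>(clock::now() - windowStart).count();
        if (elapsed >= 1.0) {                // report roughly once per second
            std::printf("FPS: %.1f\n", framesInWindow / elapsed);
            framesInWindow = 0;
            windowStart = clock::now();
        }
    }
}
```

Averaging over a one-second window (rather than reporting the inverse of a single frame time) smooths out frame-to-frame jitter from the CPU side, which is exactly the part a FLOPS-only GPU model cannot capture.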