Tags: c++, time, directx, measurement

Measuring time on GPU


I am currently investigating the time cost of some antialiasing methods, and I am not sure how to accurately measure the shader computation time. Am I correct in assuming that when I make a draw call, the data is sent to the GPU, the CPU continues to run, and then waits for the GPU to finish at the swap chain? If so, would the accurate way to time it be "StartTimer" before RenderAntialiasing() and "StopTimer" after swapChain->Present(0, 0);?

My results would otherwise be very odd.


Solution

  • This rather dated article gives you some idea of some of the challenges there are in doing performance timing for graphics.

    All a CPU-based timer can tell you is the CPU time an operation takes, and because of command batching that usually has little correlation with the GPU cost of the API calls you are bracketing. As such, you should not try to 'micro-benchmark' the CPU costs of individual rendering calls; instead, bracket most of a frame with QueryPerformanceCounter.

    To get GPU timing, you use Direct3D timestamp queries. For Direct3D 11, see D3D11_QUERY_TIMESTAMP_DISJOINT and D3D11_QUERY_TIMESTAMP.
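    The CPU-side frame bracketing mentioned above can be sketched as follows. This uses std::chrono::steady_clock as a portable stand-in (with MSVC it is implemented on top of QueryPerformanceCounter); MeasureFrameMs and renderFrame are illustrative names, not part of any API:

    ```cpp
    #include <chrono>

    // Bracket (most of) a frame on the CPU and return elapsed milliseconds.
    // renderFrame is a placeholder for your per-frame work
    // (update + draw calls + Present).
    double MeasureFrameMs(void (*renderFrame)())
    {
        using clock = std::chrono::steady_clock;
        const auto start = clock::now();
        renderFrame();
        const auto stop = clock::now();
        return std::chrono::duration<double, std::milli>(stop - start).count();
    }
    ```

    Averaging this over many frames is far more meaningful than timing a single bracketed API call.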
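    The timestamp-query pattern looks roughly like this (a sketch, not a drop-in implementation; query creation and error handling omitted). The disjoint query brackets the frame and supplies the GPU clock frequency; the two timestamp queries bracket the work being measured, and the tick delta is converted to milliseconds. GpuTicksToMs is an illustrative helper, not a Direct3D function:

    ```cpp
    #include <cstdint>

    // Convert a GPU timestamp delta to milliseconds using the clock
    // frequency reported by D3D11_QUERY_TIMESTAMP_DISJOINT.
    double GpuTicksToMs(uint64_t begin, uint64_t end, uint64_t frequency)
    {
        return 1000.0 * static_cast<double>(end - begin)
                      / static_cast<double>(frequency);
    }

    /* Per frame (D3D11 calls shown as comments, since the exact
       setup depends on your engine):

       context->Begin(disjointQuery);      // D3D11_QUERY_TIMESTAMP_DISJOINT
       context->End(timestampBegin);       // D3D11_QUERY_TIMESTAMP
       RenderAntialiasing();
       context->End(timestampEnd);         // D3D11_QUERY_TIMESTAMP
       context->End(disjointQuery);

       // A frame or two later, poll GetData() until the results are ready:
       //   D3D11_QUERY_DATA_TIMESTAMP_DISJOINT dj;
       //   context->GetData(disjointQuery, &dj, sizeof(dj), 0);
       //   if (!dj.Disjoint) {   // discard the frame if the clock was disjoint
       //       uint64_t t0, t1;
       //       context->GetData(timestampBegin, &t0, sizeof(t0), 0);
       //       context->GetData(timestampEnd,   &t1, sizeof(t1), 0);
       //       double ms = GpuTicksToMs(t0, t1, dj.Frequency);
       //   }
    */
    ```

    Note that the results trail the submission by a frame or more, so read them back asynchronously rather than stalling on GetData in the same frame.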