java, jmeter, performance-testing, splunk

Difference between Splunk and JMeter percentiles


I did performance testing for a web service. Inside the service I calculated timeTaken using Java and logged it to Splunk.

Comparing the Splunk and JMeter reports, I see differences in Average, Median, 90% Line, 95% Line, 99% Line, Min and Max.

Is this difference between the Splunk and JMeter reports expected?


Solution

  • My expectation is that you're comparing different things. I have no idea how you calculate timeTaken and feed it to Splunk, but I think the following happens:

    1. JMeter sends a request
    2. Here JMeter measurement begins
    3. Request travels to the application under test
    4. Application under test dispatches it to the relevant endpoint
    5. Here your measurement begins
    6. Application under test processes the request and prepares the response
    7. Here your measurement ends
    8. Application server sends the response back to JMeter
    9. JMeter measures time to first byte
    10. JMeter measures time to last byte

    So in Splunk you only have the timing for step 6, while JMeter reflects the timing for the whole sequence, including the time for the request and response to travel back and forth (check out the Connect Time and Latency metrics).
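    To make the distinction concrete, here is a minimal, runnable sketch. The 50 ms of application work and 20 ms of network overhead are made-up numbers, not from the question: the point is that the server-side timer only brackets step 6, while JMeter's elapsed time also covers the simulated network legs.

    ```java
    public class TimingDemo {

        // Steps 5-7: the measurement you log to Splunk brackets only the
        // application's own processing time.
        static long processRequestMs() throws InterruptedException {
            long start = System.nanoTime();                  // step 5: your measurement begins
            Thread.sleep(50);                                // step 6: application work (simulated)
            return (System.nanoTime() - start) / 1_000_000;  // step 7: your measurement ends
        }

        public static void main(String[] args) throws InterruptedException {
            long networkOverheadMs = 20;                     // steps 2-4 and 8-10 (assumed value)
            long serverMs = processRequestMs();              // what lands in Splunk
            long elapsedMs = networkOverheadMs + serverMs;   // roughly what JMeter reports
            System.out.println("Splunk timeTaken ~ " + serverMs + " ms");
            System.out.println("JMeter elapsed  ~ " + elapsedMs + " ms");
        }
    }
    ```

    With numbers like these, JMeter will always report a larger value than Splunk for the same request, and the gap grows with network distance.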

    According to the JMeter Glossary:

    Elapsed Time = Connect Time + Latency
    

    So you need to subtract Connect Time from JMeter's Elapsed Time, and the result should be closer to what you see in Splunk.
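    Even after aligning what is measured, the percentile figures themselves may not match exactly, because tools can use different percentile estimators. This is a hedged illustration with two common estimators (not necessarily the ones JMeter or Splunk actually use) disagreeing on the same ten samples:

    ```java
    import java.util.Arrays;

    public class PercentileDemo {

        // Nearest-rank estimator: the smallest sample with at least p% of
        // values at or below it.
        static long nearestRank(long[] sorted, double p) {
            int rank = (int) Math.ceil(p * sorted.length / 100.0);
            return sorted[Math.max(rank - 1, 0)];
        }

        // Linear-interpolation estimator (one of several common variants):
        // interpolates between the two samples straddling the percentile position.
        static double interpolated(long[] sorted, double p) {
            double pos = p * (sorted.length - 1) / 100.0;
            int lo = (int) Math.floor(pos);
            int hi = (int) Math.ceil(pos);
            return sorted[lo] + (pos - lo) * (sorted[hi] - sorted[lo]);
        }

        public static void main(String[] args) {
            long[] timings = {100, 120, 130, 150, 180, 200, 240, 300, 450, 900};
            Arrays.sort(timings);
            System.out.println("nearest-rank 90th percentile: " + nearestRank(timings, 90));
            System.out.println("interpolated 90th percentile: " + interpolated(timings, 90));
        }
    }
    ```

    On this data the nearest-rank 90th percentile is 450 ms while the interpolated one is roughly 495 ms, so the 90%/95%/99% lines can disagree between reports even when both tools see identical samples.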