I did performance testing for a web service. In the service I calculated timeTaken using Java and logged it to Splunk.
I am comparing the Splunk and JMeter reports and see differences in Average, Median, 90% Line, 95% Line, 99% Line, Min and Max.
Is this difference between the Splunk and JMeter reports expected?
My expectation is that you're comparing different things. I don't know how you calculate timeTaken
and feed it to Splunk, but I suspect the following is happening:
In Splunk you have only the server-side processing time, while JMeter reflects the timing of the whole sequence, including the time the request and response spend travelling back and forth over the network (check out the Connect Time and Latency metrics).
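To illustrate the gap, here is a minimal sketch of how a server-side timeTaken measurement typically works (an assumption about your setup, since you didn't show the code): the timer starts only after the request has already reached the service, so TCP/TLS connection setup and network transit are invisible to it, while JMeter's Elapsed Time includes them.

```java
// Hypothetical server-side timing: only the processing between "request
// arrived" and "response ready" is measured. Connect Time and network
// latency, which JMeter's Elapsed Time includes, never appear here.
public class TimingExample {
    public static long handleRequest() throws InterruptedException {
        long start = System.nanoTime();   // starts AFTER the request reached the server
        Thread.sleep(50);                 // stand-in for the real business logic
        // This is the kind of value that would be logged and indexed in Splunk
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("timeTaken (ms): " + handleRequest());
    }
}
```

The measured value here can only ever be the processing time, which is why the Splunk percentiles come out lower than JMeter's.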
According to the JMeter Glossary:

Elapsed Time = Connect Time + Latency
So you need to subtract the Connect Time from the JMeter figures, and the result should be closer to what you see in Splunk.
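As a sketch of that adjustment, the snippet below subtracts each sample's Connect Time from its elapsed time before averaging, using made-up per-sample values (in a real run you would take the "elapsed" and "Connect" columns from your JMeter JTL/CSV results file):

```java
import java.util.List;

// Hypothetical post-processing of JMeter samples: removing Connect Time
// from each elapsed value approximates the server-side portion, which is
// closer to what the Splunk timeTaken figures measure.
public class JMeterAdjust {
    // Each sample is {elapsed, connect} in milliseconds,
    // as found in the "elapsed" and "Connect" columns of a JTL/CSV file.
    public static double adjustedAverage(List<long[]> samples) {
        double sum = 0;
        for (long[] s : samples) {
            sum += s[0] - s[1];   // elapsed minus connect
        }
        return sum / samples.size();
    }

    public static void main(String[] args) {
        // Made-up sample data: {elapsed, connect} in ms
        List<long[]> samples = List.of(
                new long[]{120, 15},
                new long[]{100, 10},
                new long[]{140, 20});
        System.out.println(adjustedAverage(samples)); // average of 105, 90, 120
    }
}
```

Note this only removes the connection-setup overhead; the remaining network transit time (part of Latency) will still keep the JMeter figures slightly above Splunk's.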