I'm trying to use OpenDJ's searchrate
command to measure my LDAP server's search throughput and response times.
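For context, I'm invoking it roughly like this (the host, port, credentials, base DN, and filter below are placeholders for my real settings):

    searchrate -h opendj.example.com -p 1389 \
      -D "cn=Directory Manager" -w password \
      -b "dc=example,dc=com" \
      -g "rand(0,10000)" "(uid=user.%d)"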
The output looks as follows:
  Throughput                       Response Time
 (ops/second)                      (milliseconds)
recent  average  recent  average    99.9%   99.99%  99.999%  err/sec  Entries/Srch
----------------------------------------------------------------------------------
2260.7   2262.8   6.219    6.219  413.842  476.425  476.817      0.0           1.5
3857.5   3178.3   4.149    4.777  352.825  476.404  500.017      0.0           0.8
5078.2   3753.2   2.825    3.978  360.940  460.154  500.017      0.0           1.0
4557.7   3934.5   3.411    3.830  352.480  455.638  500.017      0.0           1.0
It's unclear to me what each column represents. Could someone clarify what each column means?
The header is actually pretty clear once you break it down.
The first 2 columns show throughput, in operations per second: recent is the rate over the last reporting interval, and average is the cumulative rate since the tool started.
The next 5 columns show response times in milliseconds: recent (last interval), average (since the start), and then 3 percentiles that let you understand the outliers. A percentile is a threshold, not an average: the value 352.480 in the last row means that 99.9% of requests completed in 352.480 ms or less, whereas 99.999% of requests completed within 500.017 ms. Percentiles matter because they reveal the distribution of response times, which a plain average hides.
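To make the percentile idea concrete, here is a minimal sketch in Python (not OpenDJ's code; the latency distribution is made up) that computes the same kind of average and tail percentiles:

    import math
    import random

    def percentile(sorted_values, pct):
        # Nearest-rank percentile: the smallest sample such that at
        # least pct percent of all samples are <= that sample.
        rank = math.ceil(pct / 100.0 * len(sorted_values))
        return sorted_values[rank - 1]

    # Hypothetical response times: mostly fast, with a long tail.
    random.seed(0)
    latencies_ms = sorted(random.expovariate(1 / 4.0) for _ in range(100000))

    print("average: %.3f ms" % (sum(latencies_ms) / len(latencies_ms)))
    for pct in (99.9, 99.99, 99.999):
        print("%g%%: %.3f ms" % (pct, percentile(latencies_ms, pct)))

With a distribution like this, the average stays around 4 ms while the 99.9th percentile comes out several times higher, which is exactly the gap those three columns are meant to expose.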
Finally, the last 2 columns show the number of errors per second (this should be 0) and the average number of entries returned per search (often 1, but it can vary depending on the filter and the data).