Given a counter like http_requests_total that increases over a given time range, what is the difference between calculating delta(http_requests_total[5m]) and increase(http_requests_total[5m])?

As far as I understand the documentation, delta calculates the difference between the start and end value of the time range, while increase calculates the rate and then multiplies it by the time range. But what is the actual difference? Wouldn't these two values always be the same?
Like, say I had the following values, one value per second:
t0: 5
t1: 11
t2: 18
t3: 30
Then delta would be 30 - 5 = 25. The rate would be the average of the individual per-second deltas, (6 + 7 + 12) / 3 = 8.333. If I multiply this by the time range of 3 seconds, I get 25 again. So what is actually the difference between the two?
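For reference, here is roughly how I computed those numbers, as a plain Python sketch (not PromQL, just my own approximation of the arithmetic):

    # Plain-Python illustration of the arithmetic above (not PromQL).
    samples = [5, 11, 18, 30]                  # one sample per second: t0..t3

    # delta: difference between the last and first sample in the range
    delta = samples[-1] - samples[0]           # 30 - 5 = 25

    # rate: average per-second increase, i.e. the mean of the individual deltas
    steps = [b - a for a, b in zip(samples, samples[1:])]   # [6, 7, 12]
    rate = sum(steps) / len(steps)             # 25 / 3 = 8.333...

    # increase: rate multiplied by the length of the range (3 seconds here)
    increase = rate * len(steps)               # 8.333... * 3 = 25.0

    print(delta, rate, increase)               # 25 8.333... 25.0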
delta will fail when your counter is reset (i.e. when it starts counting from 0 again), while increase/rate will detect the reset and adjust the result accordingly.

So with:
t0: 5
t1: 11
t2: 28
t3: 4
t4: 40
with delta you'll probably get 40 - 5 = 35, while increase will calculate something similar to (28 - 5) + 40 = 63.
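Here is a rough Python sketch of that difference, assuming a reset-aware increase that simply sums the positive steps and counts the post-reset value as growth since zero. This only approximates what Prometheus does (the real rate/increase functions also extrapolate to the window boundaries), so actual results will differ slightly:

    # Rough sketch: delta vs. a reset-aware increase on a counter that resets.
    samples = [5, 11, 28, 4, 40]      # counter resets to 0 somewhere before t3

    # delta: just last minus first, blind to the reset
    delta = samples[-1] - samples[0]  # 40 - 5 = 35

    # reset-aware increase: sum the per-step growth; when a sample drops,
    # assume a counter reset and count the new value as growth since 0
    increase = 0
    for prev, cur in zip(samples, samples[1:]):
        if cur >= prev:
            increase += cur - prev    # normal growth
        else:
            increase += cur           # reset detected

    print(delta, increase)            # 35 63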