The example is in R, but this is really a mathematics question.
I have a video, and I have the times of a few successive frames (tms), in seconds after the video started.
tms <- c(0.06, 0.07, 0.26, 0.30, 0.33, 0.38, 0.47)
I want to calculate the average frame rate of the video.
I can think of two ways, and I cannot decide which one makes more sense. They do not give the same result, so surely one is inaccurate!
Method 1
mean(1/diff(tms))
[1] 32.45127
Method 2
1/mean(diff(tms))
[1] 14.63415
Which one is correct and why is the other one not?
Method 2 is correct. Frame rate is typically expressed in frames per second (FPS), that is:
Total number of frames / total number of seconds
We can see the equivalence with this:
tms <- c(0.06, 0.07, 0.26, 0.30, 0.33, 0.38, 0.47)
# seconds elapsed between the first and last frame
n_seconds <- sum(diff(tms))
# number of new frames in that span (one fewer than the timestamps)
n_frames <- length(tms) - 1
# fps
n_frames / n_seconds
[1] 14.63415
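The reason Method 1 overshoots is that it takes the arithmetic mean of the instantaneous rates, so the very short 0.01 s gap between the first two frames (an instantaneous rate of 100 fps) counts just as much as the long 0.19 s gap. Method 2 is the harmonic mean of those same rates, which weights each rate by how long it was in effect. A quick check on the same data:

# instantaneous rate over each frame interval
rates <- 1 / diff(tms)
rates
[1] 100.000000   5.263158  25.000000  33.333333  20.000000  11.111111
# Method 1: arithmetic mean, every interval counts equally,
# so the 100 fps spike from the 0.01 s gap dominates
mean(rates)
[1] 32.45127
# Method 2: harmonic mean, each rate weighted by its duration,
# which is exactly total frames / total seconds
length(rates) / sum(1 / rates)
[1] 14.63415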
Note that in the code above I set n_frames to length(tms) - 1 because diff always produces a vector with one entry fewer than its input (it is a differencing tool). If you want the first frame to count as well, you could prepend a 0 to your tms vector, i.e. treat the video as starting at time 0.
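For instance (a small sketch, assuming the video really does start at time 0, which the data does not tell us):

tms0 <- c(0, tms)
# now all 7 frames have a preceding interval
n_frames <- length(tms0) - 1
# and the elapsed time runs from 0 to the last timestamp
n_seconds <- sum(diff(tms0))
n_frames / n_seconds
[1] 14.89362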