How do I measure the performance of a video decoder in terms of how many frames it can decode per second? I know the following parameters are used to arrive at fps, but I am not able to relate them in a formula that gives the exact answer:

- seconds taken to decode a video sequence
- total number of frames in the encoded video sequence
- clock rate of the hardware/processor which executes the code
- million cycles per second (MCPS) of the decoder
How are MCPS and fps related?
Given Byron's calculation, I think it should be more along the lines of:

Suppose a file consisting of N frames takes T seconds to encode on a processor that can do X MCPS. The encoder then consumes T*X million cycles in total (assuming it fully occupies the processor), so it uses (T*X)/N MC (million cycles) per frame.

Given that the frame rate is F (for instance, 25 frames per second), the above value times F gives the MCPS used by the encoder:

MCPS_used = F * (T*X)/N

If this is lower than the MCPS of your processor, you can encode in real time (or faster).
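To make the arithmetic concrete, here is a minimal sketch in C that plugs the quantities above into the formula. All variable names and the numbers are illustrative, not taken from any real codec measurement:

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative inputs; substitute your own measurements. */
    double decode_seconds = 4.0;    /* T: seconds to decode the sequence   */
    double num_frames     = 100.0;  /* N: frames in the sequence           */
    double processor_mcps = 600.0;  /* X: processor capacity, MCPS         */
    double target_fps     = 25.0;   /* F: desired playback frame rate      */

    /* Total cycles consumed, assuming the decoder fully uses the
       processor: T seconds * X million cycles/second.                     */
    double total_mc = decode_seconds * processor_mcps;

    /* Million cycles spent per frame: (T*X)/N                             */
    double mc_per_frame = total_mc / num_frames;

    /* MCPS required to sustain the target frame rate: F * (T*X)/N         */
    double required_mcps = target_fps * mc_per_frame;

    /* Achieved decode rate as a sanity check: N/T frames per second       */
    double achieved_fps = num_frames / decode_seconds;

    printf("cycles per frame : %.1f MC\n", mc_per_frame);
    printf("required MCPS    : %.1f\n", required_mcps);
    printf("achieved fps     : %.1f\n", achieved_fps);

    if (required_mcps <= processor_mcps)
        printf("real-time decoding at %.0f fps is feasible\n", target_fps);
    else
        printf("real-time decoding at %.0f fps is NOT feasible\n", target_fps);

    return 0;
}
```

With these sample numbers: 4 s * 600 MCPS = 2400 MC total, 24 MC per frame, and 25 fps * 24 MC = 600 MCPS required, which exactly matches the processor's capacity, so real-time decoding is just feasible.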