I am streaming from an MJPEG media server over RTSP.
OpenCV has a function called cvGetTickCount().
My question is: does this method return the difference in RTP timestamps (from the RTP header) between frames, or does OpenCV just look at the FPS and tick frequency and return a constant value each time?
When I print the results from cvGetTickCount(), the values look too perfect. I recall previously decoding an RTSP stream manually and seeing some variance between frames.
I am struggling to find any proper documentation for this method.
"cv2.getTickCount function returns the number of clock-cycles after a reference event (like the moment machine was switched ON) to the moment this function is called.", see Reference. You can use this function to measure runtime based on the system clock, but it has nothing to do with the RTSP stream's timestamps. To get the actual stream timestamp, I would suggest the VLC API or the FFmpeg API instead.