If you look at the VideoFrame timestamp property docs on MDN, it says the property is an integer indicating the timestamp of the video in microseconds.
What does that actually mean -- as in when does this timestamp get generated?
I've looked around and haven't found any clarifying docs or blog posts explaining it. I am trying to measure latency for a video stream over WebRTC and am wondering whether this property can be of any use to me.
The WebCodecs docs describe it as a presentation timestamp (PTS).
The PTS is used to ensure that frames are displayed in the correct order and at the correct speed. But it is not guaranteed to correspond to any real (wall-clock) time; it depends on the media source implementation.
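To make that concrete, here is a minimal sketch of how the value flows through WebCodecs: the decoder simply carries the producer-supplied PTS from the EncodedVideoChunk through to the decoded VideoFrame. The codec string, timestamp value, and `encodedBytes` payload here are illustrative assumptions.

```typescript
// Hypothetical encoded payload, standing in for real encoder output.
declare const encodedBytes: ArrayBuffer;

const decoder = new VideoDecoder({
  output: (frame: VideoFrame) => {
    // Same value the producer set below -- stream-relative, not wall-clock.
    console.log(`decoded frame PTS: ${frame.timestamp} µs`);
    frame.close();
  },
  error: (e: DOMException) => console.error(e),
});
decoder.configure({ codec: "vp8" });

const chunk = new EncodedVideoChunk({
  type: "key",
  timestamp: 40_000, // PTS in microseconds, chosen by the producer
  data: encodedBytes,
});
decoder.decode(chunk);
```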
There's a discussion at the W3C about adding a definition of the term to the docs.
Measuring latency directly is rare in WebRTC applications, as it requires precise clock synchronization between peers, which is almost impossible if you have external users. Usually, for delay estimation, you would use RTT (round-trip time). It is measured by the WebRTC engine by default; see this example.
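For reference, a minimal sketch of reading that engine-measured RTT via the standard getStats() API (assuming `pc` is an existing RTCPeerConnection):

```typescript
// The WebRTC engine measures RTT on the active ICE candidate pair via
// STUN; getStats() exposes it as currentRoundTripTime (in seconds).
async function logRoundTripTime(pc: RTCPeerConnection): Promise<void> {
  const stats = await pc.getStats();
  stats.forEach((report) => {
    if (report.type === "candidate-pair" && report.nominated) {
      console.log(`currentRoundTripTime: ${report.currentRoundTripTime} s`);
    }
  });
}
```

In practice you would sample this periodically; one-way delay is then commonly approximated as half the RTT.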