video, rtsp, rtp, live-streaming

Why does a video player lose some of the first frames of a live RTP stream?


I have written an RTSP server. It sends an H.264/AAC stream over RTP/UDP. The RTP packet send interval is 30 ms for video and 20 ms for audio. Timestamps are extracted from the FLV tags (my server reads the video and audio data from an FLV file). The video player loses the first few video frames, and as a result the audio is ahead of the video by a few seconds. Why is this? Should I pause on the server side before starting to stream?


Solution

  • There are a couple of possibilities:

    • UDP is an unreliable protocol. You can check the RTP sequence numbers to see whether packets are being dropped, and how many/which frames are affected. Increasing the UDP receive buffer size on the client can help minimise the loss; this can be done on Linux and on Windows alike (see the sketch after this list).

    • The client can only decode the video correctly once it has received an IDR frame; until that point it cannot produce correct pictures. Is the first frame you stream to a new client an IDR frame (keeping in mind that it can still be lost)? A small check for this is sketched below as well.
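    As a rough illustration of the first point, the snippet below enlarges the UDP receive buffer with SO_RCVBUF and reports gaps in the 16-bit RTP sequence numbers. It is a minimal sketch under assumed values: the 4 MB buffer size and port 5004 are arbitrary, error handling is omitted, and on Linux the kernel may cap the buffer at net.core.rmem_max.

```c
/* Minimal sketch (assumed values): enlarge the client's UDP receive buffer
 * and report gaps in the 16-bit RTP sequence numbers. */
#include <stdio.h>
#include <stdint.h>
#include <sys/socket.h>
#include <netinet/in.h>

int main(void) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    int rcvbuf = 4 * 1024 * 1024;                 /* ask for ~4 MB; the OS may cap this */
    setsockopt(sock, SOL_SOCKET, SO_RCVBUF, &rcvbuf, sizeof(rcvbuf));

    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(5004);                  /* assumed RTP port */
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(sock, (struct sockaddr *)&addr, sizeof(addr));

    uint8_t pkt[1500];
    uint16_t expected = 0;
    int have_first = 0;

    for (;;) {
        ssize_t n = recv(sock, pkt, sizeof(pkt), 0);
        if (n < 12)
            continue;                             /* shorter than an RTP header */
        uint16_t seq = (uint16_t)((pkt[2] << 8) | pkt[3]);   /* seq. number, bytes 2-3 */
        if (have_first && seq != expected)
            printf("gap: expected %u, got %u\n", (unsigned)expected, (unsigned)seq);
        expected = (uint16_t)(seq + 1);
        have_first = 1;
    }
}
```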
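    For the second point, here is a hedged sketch of how you could tell whether an H.264 RTP payload carries an IDR slice, following the RFC 6184 packetization rules. Only the single-NAL-unit and FU-A cases are handled, and the function name is made up for illustration.

```c
/* Sketch: classify an H.264 NAL unit carried in an RTP payload (RFC 6184).
 * 'payload' is assumed to point just past the 12-byte RTP header. */
#include <stdint.h>

static int is_idr_nal(const uint8_t *payload, int len) {
    if (len < 1) return 0;
    uint8_t type = payload[0] & 0x1F;     /* low 5 bits = NAL unit type */
    if (type == 5)                        /* 5 = coded slice of an IDR picture */
        return 1;
    if (type == 28 && len >= 2)           /* 28 = FU-A fragmentation unit */
        return (payload[1] & 0x1F) == 5;  /* real NAL type is in the FU header */
    return 0;
}
```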

    In any case, it seems to me that there is also another problem with your video player application: even if frames are dropped, the player is responsible for buffering and synchronising audio and video, and it should be able to do this irrespective of packet loss.

    On a purely informative note, you could also implement RTP/RTCP interleaved over RTSP (and hence over TCP). That way you don't have to worry about dropped packets at all. Libraries such as the live555 streaming media library and VLC support this.
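    For reference, interleaving simply prefixes each RTP/RTCP packet on the RTSP TCP connection with a 4-byte header defined in RFC 2326 (section 10.12): a '$' marker, the channel id negotiated in the SETUP Transport header, and a 16-bit big-endian length. The helper below is an illustrative sketch of that framing, not live555's or VLC's actual code.

```c
/* Sketch of RTSP interleaved framing (RFC 2326, section 10.12):
 * '$' + channel id + 16-bit big-endian length + the RTP/RTCP packet. */
#include <stdint.h>
#include <string.h>

/* Writes the 4-byte interleave header followed by the packet.
 * 'out' must have room for rtp_len + 4 bytes. Returns the total length. */
static int frame_interleaved(uint8_t *out, uint8_t channel,
                             const uint8_t *rtp, uint16_t rtp_len) {
    out[0] = '$';                       /* magic byte marking interleaved data */
    out[1] = channel;                   /* e.g. 0 for RTP, 1 for RTCP */
    out[2] = (uint8_t)(rtp_len >> 8);   /* length, network byte order */
    out[3] = (uint8_t)(rtp_len & 0xFF);
    memcpy(out + 4, rtp, rtp_len);
    return rtp_len + 4;
}
```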

    To answer your last question about the pause: no, that has nothing to do with it. RTSP is purely the signaling protocol; the packet loss occurs at the transport (UDP) layer.