Tags: node.js, html, websocket, http-live-streaming

Accessing live video stream midway using websockets


I am using a combination of fragmented MP4 and websockets to stream live video to the web browser, where MSE takes over.

I have successfully fragmented the video into the appropriate fMP4 format using ffmpeg and have checked the data using an mpeg4parser tool. Using a websocket server, the incoming data is broadcast to all the browser clients connected via websocket. This works fine for both playback and live streaming (using an RTSP stream as the input).
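
For context, the client side is wired up roughly as in the sketch below. This is a simplified version, not my exact code: the websocket URL and the codec string are placeholders that would need to match the real stream.

```javascript
// Simplified client.js: feed fMP4 fragments from a websocket into MSE.
// 'ws://localhost:8080' and the codec string are placeholders.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  const queue = [];

  // When the previous append finishes, append the next queued fragment.
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  const ws = new WebSocket('ws://localhost:8080');
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    // Append directly if the buffer is idle, otherwise queue the fragment.
    if (sourceBuffer.updating || queue.length) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});
```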

The problem I am facing occurs when a client tries to access the stream midway, i.e. once the ffmpeg stream has already started. I have saved the init segment (ftyp + moov) elements in a queue buffer in the websocket server, and this queue buffer sends the data to each new client on connection.
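
The caching logic on the server is along these lines. This is a simplified sketch using the `ws` package; the `onFfmpegData` hook is a stand-in for wherever the ffmpeg output actually arrives, and the box-type check assumes each chunk starts on an ISO BMFF box boundary.

```javascript
// Simplified websocket server: cache the init segment (ftyp + moov)
// and replay it to every client that connects after the stream has started.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

const initSegments = [];   // queue buffer holding the ftyp + moov chunks
let initComplete = false;

// Read the 4-character box name at bytes 4..7 of a chunk,
// assuming the chunk starts on a box boundary.
function boxType(chunk) {
  return chunk.slice(4, 8).toString('ascii');
}

// Called for every chunk piped in from the ffmpeg output (hypothetical hook).
function onFfmpegData(chunk) {
  const type = boxType(chunk);
  if (!initComplete && (type === 'ftyp' || type === 'moov')) {
    initSegments.push(chunk);
    if (type === 'moov') initComplete = true;
  }
  // Broadcast every chunk to all currently connected clients.
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(chunk);
  }
}

wss.on('connection', (ws) => {
  // New client: send the cached init segment first so MSE can initialise.
  for (const chunk of initSegments) ws.send(chunk);
});
```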

I believe this data is sent correctly, since the browser console does not throw the 'Media Source Element not found' error. Yet no video plays when the client receives the broadcast moof/mdat pairs.

So here are a couple of questions I would like answered:

1) I have observed that each moof element contains a sequence number in its mfhd child element. Does this always have to start from 1, which will naturally not be the case for a video stream accessed midway?

2) Is it possible to view the data in the browser client.js? At present all I can see is that my mediaBuffer contains a bunch of [Object ArrayBuffer]. Can I print the binary data inside these buffers (see the inspection sketch after this list)?

3) From the server side the data appears to be sent in moof/mdat fragments, as each new chunk arriving from the ffmpeg output at the websocket server begins with a moof element. I noticed this by printing the binary data to the console. Is there a similar way to view this data on the client side?

4) Does anyone have an idea of why this is happening? Is there some fragmented MP4 or ISO BMFF format detail that I am missing?
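
Regarding 2) and 3): one way I could imagine inspecting the fragments on the client is to wrap each ArrayBuffer in a Uint8Array and walk the top-level box headers directly, something along the lines below (the helper name is my own, and 64-bit "largesize" boxes are ignored for simplicity).

```javascript
// Log the top-level ISO BMFF box types and a short hex preview of a fragment
// received over the websocket, so moof/mdat boundaries are visible in the console.
function logBoxes(arrayBuffer) {
  const bytes = new Uint8Array(arrayBuffer);
  const view = new DataView(arrayBuffer);
  let offset = 0;
  while (offset + 8 <= bytes.length) {
    const size = view.getUint32(offset);   // 32-bit box size (big-endian)
    const type = String.fromCharCode(...bytes.slice(offset + 4, offset + 8));
    console.log(`box ${type}, size ${size}`);
    if (size < 8) break;                   // stop on size 0/1 (largesize) or bad data
    offset += size;
  }
  // Hex dump of the first 32 bytes, similar to what the server prints.
  const preview = Array.from(bytes.slice(0, 32))
    .map((b) => b.toString(16).padStart(2, '0'))
    .join(' ');
  console.log(preview);
}
```

Calling this from the websocket onmessage handler would show whether each incoming message really starts with a moof element.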

If any further detail is required for clarification, please let me know and I will provide it.


Solution

  • Make sure your fragments include a base media decode time (tfdt). Then set the video tag's 'currentTime' to the time of the first fragment received.
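
A minimal sketch of what that might look like on the client, reusing the `video` and `sourceBuffer` handles from the client sketch above: instead of parsing the tfdt box by hand, one option is to wait for the first moof/mdat pair to be appended and jump to the start of the buffered range, which MSE derives from the fragment's base media decode time.

```javascript
// After the init segment plus the first moof/mdat pair have been appended,
// move playback to where the buffered data actually starts. For a stream
// joined midway this is the decode time of the first received fragment, not 0.
let seeked = false;
sourceBuffer.addEventListener('updateend', () => {
  if (!seeked && sourceBuffer.buffered.length > 0) {
    video.currentTime = sourceBuffer.buffered.start(0);
    seeked = true;
  }
});
```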