Tags: webrtc, ros, amazon-kinesis-video-streams

WebRTC signaling succeeds but no video is coming through


I am trying to set up a video stream from a robot using WebRTC.

As far as I can tell, the signaling establishes a connection successfully; however, no video is being streamed.

I use AWS Kinesis Video Streams as the signaling server, and the AWS Kinesis Video Streams WebRTC SDK for the master node.

For the viewer, I use the Kinesis WebRTC Test Page, with the only change being that the viewer does not request audio.

I believe that the signaling works. At least the viewer is both sending and receiving ICE candidates and there are no errors.

The master node also starts to send data as expected, but the video is never displayed on the viewer.

My question is how can I debug where the problem is?

I have looked at Chrome's webrtc-internals page, which generated the graphs below. They show that packets are being received by Chrome, but no frames are being decoded. Is that correct?

[Screenshot of the webrtc-internals graphs]

On the robot, I am running ROS as the middle layer, and I am trying to stream a USB webcam.
To do this, I run the h264_video_encoder node. When the node starts up, it reports these settings:

x264 - core 152 r2854 e9a5903 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=1 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=2 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=3 lookahead_threads=3 sliced_threads=1 slices=3 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=1 keyint=30 keyint_min=16 scenecut=40 intra_refresh=0 rc=abr mbtree=0 bitrate=2048 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00

I suspect the most likely problem is with the encoding, but I don't know how to proceed with debugging this issue.
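One way to narrow down an encoding problem like this is to inspect which NAL unit types the encoded H.264 stream actually contains. The sketch below is a hypothetical debugging helper (not part of the question's setup): it scans an Annex B byte stream for start codes and lists the type of each NAL unit, so you can check whether parameter sets (SPS, type 7; PPS, type 8) ever appear alongside the slice data.

```python
# Hypothetical debugging helper: list the NAL unit types in an H.264
# Annex B byte stream. A browser decoder needs SPS (7) and PPS (8)
# before it can decode any slices; if they never appear, the stream
# will be received but zero frames will be decoded.

def nal_unit_types(data: bytes) -> list:
    """Return the nal_unit_type of each NAL unit in an Annex B stream."""
    types = []
    i = 0
    while True:
        # Find the next 3-byte start code 00 00 01 (a 4-byte
        # 00 00 00 01 start code also ends in this pattern).
        i = data.find(b"\x00\x00\x01", i)
        if i == -1:
            break
        i += 3  # skip past the start code to the NAL header byte
        if i < len(data):
            types.append(data[i] & 0x1F)  # low 5 bits = nal_unit_type
    return types

# Example: a keyframe carrying SPS, PPS, and an IDR slice (type 5).
frame = (b"\x00\x00\x00\x01\x67\x64\x00\x1f"   # SPS
         b"\x00\x00\x00\x01\x68\xee\x3c\x80"   # PPS
         b"\x00\x00\x00\x01\x65\x88\x84\x00")  # IDR slice
print(nal_unit_types(frame))  # -> [7, 8, 5]
```

Running this over the frames taken from the encoder output would show immediately whether types 7 and 8 are missing.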


Solution

  • The problem I faced was that the H.264 stream I generated was missing the SPS and PPS NAL units, so the viewer did not know how to decode the stream.

    The underlying problem was that I was converting from KinesisVideoFrame.msg, which contained my encoded stream, but I was unaware that frame_data did not contain the SPS and PPS NAL units. Those are stored in codec_private_data, so I had to prepend codec_private_data to frame_data to produce the final frame data to be sent over the stream.
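The fix above can be sketched as follows. This is a minimal illustration, not the actual ROS node code: the field names frame_data and codec_private_data come from the KinesisVideoFrame.msg description in the answer, while the helper function name is my own.

```python
# Minimal sketch of the fix: prepend the codec private data (the SPS and
# PPS NAL units) to the encoded frame data before handing the buffer to
# the WebRTC sender, as described in the answer above.

def build_frame(codec_private_data: bytes, frame_data: bytes) -> bytes:
    """Combine parameter sets and slice data into one sendable buffer.

    The SPS/PPS must appear in the byte stream before the slice data,
    otherwise the viewer's decoder cannot initialize and no frames are
    decoded even though packets arrive.
    """
    return codec_private_data + frame_data
```

Strictly, the parameter sets only need to precede keyframes (decoders cache them), but prepending them to each frame, as the answer describes, also works at the cost of a few extra bytes per frame.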