ffmpeg, streaming, ffserver

Send 2 different camera feeds to ffserver from ffmpeg


I'm currently working on a project where I have to stream two webcam feeds from one computer to another over a TCP connection. I can stream one without any problem, using the setup below:

ffserver.conf:

        HTTPPort 8090
        HTTPBindAddress 0.0.0.0
        MaxClients 40
        MaxBandwidth 30000

        CustomLog -
        NoDaemon

        <Stream status.html>
        Format status
        ACL allow localhost
        ACL allow 192.168.0.0 192.168.255.255
        </Stream>

        # feed for camera 1
        <Feed webcam1.ffm>
        File /tmp/webcam1.ffm
        FileMaxSize 100M
        </Feed>

        # feed for camera 2
        <Feed webcam2.ffm>
        File /tmp/webcam2.ffm
        FileMaxSize 100M
        </Feed>

        # stream for feed 1
        <Stream webcam1.mjpeg>
        Feed webcam1.ffm
        Format mjpeg
        VideoSize 1280x720
        VideoFrameRate 30
        Preroll 0
        NoAudio
        Strict -1
        </Stream>

        # stream for feed 2
        <Stream webcam2.mjpeg>
        Feed webcam2.ffm
        Format mjpeg
        VideoSize 1280x720
        VideoFrameRate 30
        Preroll 0
        NoAudio
        Strict -1
        </Stream>

command to run ffserver:

          ffserver /etc/ffserver.conf

command to feed ffserver:

         ffmpeg -v 2 -r 20 -f video4linux2 -i /dev/video0 http://localhost:8090/webcam1.ffm

and it works perfectly, but when I try to run the other feed:

         ffmpeg -v 2 -r 20 -f video4linux2 -i /dev/video1 http://localhost:8090/webcam2.ffm

I can only see the second stream, and the first one no longer works. Any ideas?


Solution

  • Using multiple USB webcams simultaneously may saturate the bus. This seems to be your case since starting the second camera cuts the first one off.

    The situation has improved since the days when USB 1.1 was common. Even most low-end motherboards have multiple USB 2/3 controllers, which are entirely independent and can run multiple cameras without concern. USB 2 can support multiple cameras at low resolution and framerate. High-framerate, high-resolution cameras sending uncompressed images may still saturate the bus.

    Source
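
    A quick way to check whether both cameras hang off the same controller is to inspect the USB topology, e.g. with lsusb from the usbutils package (assuming it is available):

        lsusb -t

    If both webcams appear under the same root hub, they are sharing that controller's bandwidth.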

    Possible solutions:

    1. Switch to MJPEG input (lower bandwidth use)

    Check the capabilities of your device:

        ffmpeg -f v4l2 -list_formats all -i /dev/video0

    If it supports MJPEG then use it instead of raw video:

        ffplay -f v4l2 -input_format mjpeg -i /dev/video0 ...
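
    Applied to the two feeds above, the commands would look roughly like this (the resolution and framerate are only examples and must match a mode each camera actually supports):

        ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 20 -i /dev/video0 http://localhost:8090/webcam1.ffm
        ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 20 -i /dev/video1 http://localhost:8090/webcam2.ffm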

    2. Use a different USB controller for the second camera

    If the motherboard doesn't feature multiple controllers then get a PCI USB card.
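
    To verify which controller each camera ends up on, the sysfs link of the video device can be inspected (the device names below are just the ones from the question):

        readlink -f /sys/class/video4linux/video0/device
        readlink -f /sys/class/video4linux/video1/device

    If the resolved paths differ in their PCI/usbN portion, the cameras are on separate controllers.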