Tags: bash, shell, ffmpeg, broadcast, streamlink

fallback input when blackscreen ffmpeg


I would like to have a "fallback" video when using the ffmpeg command. The goal is that I broadcast a video, and once it displays an almost black screen, there should be a switch to the fallback, then a switch back once the black screen is over. I know that we can detect "black frames" with the blackframe filter, but I'm lost after that. Does anyone have an idea?

I'm at this point for the moment (I'm using streamlink to get the input):

streamlink <<THE_URL>> best -O | ffmpeg -re -i pipe:0 -c:v libx264 -vf "blackframe=amount=40:thresh=5" -c:a aac -strict -2 -f flv <<RTMP_URL>> | grep blackframe

thank you


Solution

  • We may solve the problem using geq, alphamerge and overlay filters:

    streamlink <<THE_URL>> best -O | ffmpeg -i pipe: -f lavfi -i smptebars=size=384x216:rate=30 -filter_complex "[1:v][0:v]scale2ref[v1][v0];[v0]format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255'[alpha];[v1][alpha]alphamerge[alpha_v1];[0:v][alpha_v1]overlay=shortest=1[v]" -map "[v]" -map 0:a -c:v libx264 -pix_fmt yuv420p -c:a aac -strict -2 -f flv <<RTMP_URL>>

    The above example uses the smptebars synthetic pattern as the fallback video.
    scale2ref is used to adjust the resolution of the pattern to the resolution of the main input video.
    For best results, the rate parameter (e.g. rate=30) should match the framerate of the main video.
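
    If the framerate of the main input is unknown (and the input is a file rather than a live pipe), we may query it first, e.g. with ffprobe. A minimal sketch, where in.mp4 is a placeholder for the actual input:

    # Print the framerate of the first video stream (in.mp4 is a placeholder input):
    ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 in.mp4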

    We may choose to replace the synthetic pattern with a fallback from a video file - the fallback video should be long enough (or played in a loop using the -stream_loop option), and its framerate should preferably match the main input, as sketched below.
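
    For example, a sketch of the same command with a looped file fallback (fallback.mp4 is a hypothetical file name):

    # fallback.mp4 is a hypothetical fallback file, looped indefinitely with -stream_loop -1:
    streamlink <<THE_URL>> best -O | ffmpeg -i pipe: -stream_loop -1 -i fallback.mp4 -filter_complex "[1:v][0:v]scale2ref[v1][v0];[v0]format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255'[alpha];[v1][alpha]alphamerge[alpha_v1];[0:v][alpha_v1]overlay=shortest=1[v]" -map "[v]" -map 0:a -c:v libx264 -pix_fmt yuv420p -c:a aac -strict -2 -f flv <<RTMP_URL>>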

    Note:

    • Using blackframe is not going to work, because the list of detected black frames is only printed to the log, and cannot be used to modify the main video "on the fly" (the list may, however, be used in a second pass for offline processing, as sketched below).
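
      For reference, a sketch of collecting the blackframe list for offline use - note that FFmpeg writes the filter log to stderr, so stderr has to be redirected before grep (in.mp4 is a placeholder input):

      # blackframe reports go to stderr; redirect stderr to stdout before grep (in.mp4 is a placeholder):
      ffmpeg -i in.mp4 -vf "blackframe=amount=40:thresh=5" -f null - 2>&1 | grep blackframe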

    Filter chain explanation:

    • [1:v][0:v]scale2ref[v1][v0] - Scales the fallback video to the resolution of the main input video.
    • [v0]format=gray,geq=lum_expr='lt(p(X,Y), 5)' - Replaces every pixel below the threshold 5 with the value 1, and sets pixels at 5 or above to 0.
    • geq='gt(lumsum(W-1,H-1),0.4*W*H)*255' - lumsum(W-1,H-1) sums all the pixel values in the frame; since each pixel is 0 or 1, the sum is the count of dark pixels. All pixels of the output frame are set to 255 if that count exceeds 40% of the total pixel count (0.4*W*H is 40% of the total; e.g. for a 384x216 frame, the threshold is 0.4*384*216 ≈ 33178 of 82944 pixels).
      If fewer than 40% of the total pixels are below 5, the [alpha] frame is black (all zeros).
      If more than 40% of the total pixels are below 5, the [alpha] frame is white (all 255).
    • [v1][alpha]alphamerge[alpha_v1] - Merges [alpha] as the alpha channel of the scaled fallback frame [v1].
      If [alpha] is black, the [alpha_v1] is fully transparent.
      If [alpha] is white, the [alpha_v1] is fully opaque.
    • [0:v][alpha_v1]overlay=shortest=1[v] - Overlays [alpha_v1] on top of the main video frame.
      If [alpha_v1] is transparent, the main video frame is unmodified.
      If [alpha_v1] is opaque, the main video frame is replaced with the fallback frame.

    Testing:

    Create a synthetic video whose frames get darker over time:

    ffmpeg -y -f lavfi -i testsrc=size=192x108:rate=1:duration=10 -filter_complex "[0:v]format=rgb24[v0];[v0][0:v]blend=all_mode='overlay':all_expr='max(A*0.5-T*15\,0)',format=rgb24[v]" -map "[v]" -vcodec libx264 -pix_fmt yuv420p in.mp4

    The video has 10 frames; the last two are considered "black frames".

    [sample frames of the test video, gradually fading to black]


    Executing the blackframe filter (used as a reference):

    ffmpeg -y -hide_banner -loglevel info -i in.mp4 -vf "format=gray,blackframe=amount=40:thresh=5" -c:v libx264 -pix_fmt yuv420p out.flv

    The last two frames are marked as "black frames":

    [Parsed_blackframe_1 @ 00000285e905fc40] frame:8 pblack:52 pts:131072 t:8.000000 type:P last_keyframe:0
    [Parsed_blackframe_1 @ 00000285e905fc40] frame:9 pblack:100 pts:147456 t:9.000000 type:P last_keyframe:0


    Marking pixels below 5 as white pixels (testing format=gray,geq=lum_expr='lt(p(X,Y), 5)', multiplied by 255 for visibility):

    ffmpeg -y -hide_banner -i in.mp4 -vf "format=gray,geq=lum_expr='lt(p(X,Y), 5)*255'" -c:v libx264 -g 1 -pix_fmt yuv420p out.flv

    The first two frames have a small number of dark pixels, and the last two have many dark pixels (the last image is entirely white):

    [output frames: dark pixels marked as white; the last frame is entirely white]


    Testing geq='gt(lumsum(W-1,H-1),0.4*W*H)':

    ffmpeg -y -hide_banner -i in.mp4 -vf "format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255'" -c:v libx264 -g 1 -pix_fmt gray out.flv

    The first 8 frames are black, and the last 2 frames are white:

    [output frames: the first 8 frames are black, the last 2 frames are white]


    Testing the complete filter chain:

    ffmpeg -y -hide_banner -i in.mp4 -f lavfi -i smptebars=size=384x216:rate=1 -filter_complex "[1:v][0:v]scale2ref[v1][v0];[v0]format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255'[alpha];[v1][alpha]alphamerge[alpha_v1];[0:v][alpha_v1]overlay=shortest=1[v]" -map "[v]" -c:v libx264 -g 1 -pix_fmt yuv420p out.flv

    The last two frames are replaced with the fallback video:

    [output frames: the last two frames are replaced with the SMPTE bars pattern]


    Note:

    • The above solution may not work in real-time, because the filter chain may be too computationally intensive.
      To reduce computation time, we may downscale the input video, apply the alpha-producing stages on the shrunken frame, and upscale the alpha before alphamerge, as sketched below.
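
      A sketch of the reduced-computation variant - the alpha is computed at quarter resolution and scaled back up before alphamerge (assuming the input width and height are divisible by 4; since the alpha frame is uniformly black or white, upscaling it does not introduce artifacts):

      # Compute the alpha at quarter resolution, then upscale it back before alphamerge
      # (assumes the input dimensions are divisible by 4):
      streamlink <<THE_URL>> best -O | ffmpeg -i pipe: -f lavfi -i smptebars=size=384x216:rate=30 -filter_complex "[1:v][0:v]scale2ref[v1][v0];[v0]scale=iw/4:ih/4,format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255',scale=4*iw:4*ih[alpha];[v1][alpha]alphamerge[alpha_v1];[0:v][alpha_v1]overlay=shortest=1[v]" -map "[v]" -map 0:a -c:v libx264 -pix_fmt yuv420p -c:a aac -strict -2 -f flv <<RTMP_URL>>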

    Testing a live stream from twitch.tv (piping to FFplay):

    streamlink https://www.twitch.tv/ninja --default-stream 480p -O | ffmpeg -i pipe: -f lavfi -i smptebars=size=384x216:rate=30 -filter_complex "[1:v][0:v]scale2ref[v1][v0];[v0]format=gray,geq=lum_expr='lt(p(X,Y), 5)',geq='gt(lumsum(W-1,H-1),0.4*W*H)*255'[alpha];[v1][alpha]alphamerge[alpha_v1];[0:v][alpha_v1]overlay=shortest=1[v]" -map "[v]" -map 0:a -c:v libx264 -pix_fmt yuv420p -c:a aac -strict -2 -f flv pipe: | ffplay pipe:

    On my machine, 480p quality works, but with 720p60 the CPU utilization is at 100% and the video stutters.
