Tags: angular, asp.net-core, ffmpeg, rtsp, video.js

Streaming RTSP (AspNet 5 API, FFMPEG, Angular 10, videoJs)


Description:

I have an API (ASP.NET 5) which connects to an IP camera over RTSP. The camera sends an H.264 stream, which is converted with ffmpeg to an HLS (m3u8) stream and returned to the Angular client as follows:

public async Task<ActionResult> GetCameraH264Stream()
{
    string deviceIp = "rtsp://[CAMERA_IP]/";
    string recordingUri = "rtsp://[USER:PASSWORD]@[CAMERA_IP]/axis-media/media.amp";

    string output = Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".m3u8");
    var mediaInfo = await FFmpeg.GetMediaInfo(recordingUri);

    var conversionResult = FFmpeg.Conversions.New()
        .AddStream(mediaInfo.Streams)
        .SetOutput(output)
        .Start();

    // Allow any CORS origin
    Response.Headers.Add("Access-Control-Allow-Origin", "*");
    Response.Headers.Add("Cache-Control", "no-cache");

    // Open the file and return the stream to the client
    FileStreamResult result = new FileStreamResult(System.IO.File.Open(output, FileMode.Open, FileAccess.Read, FileShare.Read), "application/octet-stream");
    result.EnableRangeProcessing = true;
    return result;
}

If I call this method directly, the browser downloads a file, which I can play with VLC.
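For reference, the conversion that Xabe.FFmpeg launches is roughly equivalent to running ffmpeg by hand. A sketch of such an invocation (the URL and all HLS options here are illustrative, not taken from the question):

```shell
# Pull the RTSP feed over TCP and repackage the H.264 video into an HLS
# playlist plus .ts segments without re-encoding (-c copy).
# Segment length and playlist size are illustrative values.
ffmpeg -rtsp_transport tcp \
       -i "rtsp://USER:PASSWORD@CAMERA_IP/axis-media/media.amp" \
       -c copy \
       -f hls \
       -hls_time 2 \
       -hls_list_size 5 \
       -hls_flags delete_segments \
       /tmp/stream.m3u8
```

Keeping `-hls_list_size` small together with `delete_segments` bounds disk usage, which matters for a camera stream that never ends.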

In my Angular app, I have this component:

app-vjs-player:

@Component({
       selector: 'app-vjs-player',
       template: '<video #target class="video-js" controls muted playsinline preload="none"></video>',
       encapsulation: ViewEncapsulation.None,
    })
export class VjsPlayerComponent implements OnInit, OnDestroy {
  @ViewChild('target', {static: true}) target: ElementRef;
  
  @Input() options: {
      fluid: boolean,
      aspectRatio: string,
      autoplay: boolean,
      sources: {
          src: string,
          type: string,
      }[],
      vhs: {
        overrideNative: true
      },
  };
  player: videojs.Player;

  constructor(
    private elementRef: ElementRef,
  ) { }

  ngOnInit() {
    // instantiate Video.js
    this.player = videojs(this.target.nativeElement, this.options, function onPlayerReady() {
      console.log('onPlayerReady', this);
    });
    
  }

  ngOnDestroy() {
    // destroy player
    if (this.player) {
      this.player.dispose();
    }
  }
}

This component is used like this:

TS:

playerOptions = {
    fluid: false,
    aspectRatio: "16:9",
    autoplay: false,
    sources: [{
        src: 'https://localhost:44311/api/GetCameraH264Stream',
        type: 'application/x-mpegURL',
    }],
}

HTML:

<app-vjs-player #videoJs [options]="playerOptions"></app-vjs-player>

Problem

All this seems to work pretty well, until video.js throws this error when the API returns the stream:

ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) The media could not be loaded, either because the server or network failed or because the format is not supported

When I open the network dev tools, the request status is "Canceled", but I don't know whether video.js cancels it because the file stream can't be read, or because of the way the API returns the stream.

Any idea?

Source

Forwarding RTSP stream from IP Camera to Browser in ASP.NET Core

VideoJs Angular integration

Xabe.FFMPEG

EDIT

  • I tried to limit the resolution and the bitrate, but I can't configure the camera that way: other applications are using it, and the camera does not have any streaming URL that allows this configuration.
  • I have been able to get an image after changing the content type of the API response. I changed:
FileStreamResult result = new FileStreamResult(System.IO.File.Open(output, FileMode.Open, FileAccess.Read, FileShare.Read), "application/octet-stream");

to

FileStreamResult result = new FileStreamResult(System.IO.File.Open(output, FileMode.Open, FileAccess.Read, FileShare.Read), "application/x-mpegURL");

With this, the first packet is displayed, but the subsequent requests are still canceled.
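Those subsequent requests are segment fetches: the player reads the m3u8 playlist and then requests every segment URI it lists. A minimal sketch (TypeScript, not from the original code) of extracting those URIs shows what the player asks for next:

```typescript
// Extract the segment URIs from an HLS media playlist.
// Lines starting with '#' are tags/comments; the remaining
// non-empty lines are the segment URIs the player will fetch.
function segmentUris(playlist: string): string[] {
  return playlist
    .split(/\r?\n/)
    .map(line => line.trim())
    .filter(line => line.length > 0 && !line.startsWith('#'));
}

const sample = [
  '#EXTM3U',
  '#EXT-X-TARGETDURATION:2',
  '#EXTINF:2.0,',
  'seg0.ts',
  '#EXTINF:2.0,',
  'seg1.ts',
].join('\n');

// The player will request these paths next, so the API must serve them too.
console.log(segmentUris(sample)); // [ 'seg0.ts', 'seg1.ts' ]
```

This is why the API also needs an endpoint for the .ts files (see the solution below the question): serving the playlist alone is not enough.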


Solution

  • The change to the response ContentType works (see the last edit on the question).

    It seems that the canceled requests were caused by the slow network. All the code above works as is, except for the last modification (application/octet-stream => application/x-mpegURL). Here is the updated API method:

    
    public async Task<ActionResult> GetCameraH264Stream()
    {
        string deviceIp = "rtsp://[CAMERA_IP]/";
        string recordingUri = "rtsp://[USER:PASSWORD]@[CAMERA_IP]/axis-media/media.amp";

        string output = Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".m3u8");
        var mediaInfo = await FFmpeg.GetMediaInfo(recordingUri);

        var conversionResult = FFmpeg.Conversions.New()
            .AddStream(mediaInfo.Streams)
            .SetOutput(output)
            .Start();

        // Allow any CORS origin
        Response.Headers.Add("Access-Control-Allow-Origin", "*");
        Response.Headers.Add("Cache-Control", "no-cache");

        // Open the file and return the stream to the client
        FileStreamResult result = new FileStreamResult(System.IO.File.Open(output, FileMode.Open, FileAccess.Read, FileShare.Read), "application/x-mpegURL");
        result.EnableRangeProcessing = true;
        return result;
    }
    
    

    EDIT

    It seems that the code above creates an ffmpeg.exe process each time a request is made. That process never ends, since the camera stream itself never ends. I don't know how to kill the ffmpeg process yet, but I have modified the stream conversion retrieval so it reuses an existing ffmpeg process for the stream if one already exists:

    public async Task<ActionResult> GetCameraH264Stream()
    {
        string deviceIp = "rtsp://[CAMERA_IP]/";
        string streamingUri = "rtsp://[USER:PASSWORD]@[CAMERA_IP]/axis-media/media.amp";

        if (!this.cache.GetCache("camstream").TryGetValue(streamingUri, out object output))
        {
            output = Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".m3u8");
            var mediaInfo = await FFmpeg.GetMediaInfo(streamingUri);
            var conversionResult = FFmpeg.Conversions.New()
                .AddStream(mediaInfo.Streams)
                .SetOutput((string)output)
                .Start();
            this.cache.GetCache("camstream").Set(streamingUri, output);

            // Delay until the file is created
            while (!System.IO.File.Exists((string)output))
            {
                await Task.Delay(100);
            }
        }

        // Allow any CORS origin
        Response.Headers.Add("Access-Control-Allow-Origin", "*");
        Response.Headers.Add("Cache-Control", "no-cache");

        // Open the file and return the stream to the client
        FileStreamResult result = new FileStreamResult(System.IO.File.Open((string)output, FileMode.Open, FileAccess.Read, FileShare.Read), "application/x-mpegURL");
        result.EnableRangeProcessing = true;
        return result;
    }
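    The get-or-create flow above — look up the stream in a cache, start the conversion only on a miss, then reuse the cached output path — can be sketched generically (TypeScript, all names hypothetical):

```typescript
// Generic get-or-create cache: the factory runs only on a cache miss,
// so repeated requests for the same key reuse one underlying resource.
class StreamCache<T> {
  private entries = new Map<string, T>();

  getOrCreate(key: string, factory: () => T): T {
    let value = this.entries.get(key);
    if (value === undefined) {
      value = factory();          // e.g. start the ffmpeg conversion
      this.entries.set(key, value);
    }
    return value;
  }
}

// Usage sketch: two requests for the same camera share one "conversion".
let started = 0;
const cache = new StreamCache<string>();
const uri = 'rtsp://CAMERA_IP/stream';
const a = cache.getOrCreate(uri, () => { started++; return '/tmp/out.m3u8'; });
const b = cache.getOrCreate(uri, () => { started++; return '/tmp/out.m3u8'; });
console.log(a === b, started); // true 1
```

    Note that, like the C# version, this sketch is not safe against two concurrent first requests; a real implementation would need a lock or a pending-promise map around the miss path.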
    
    

    And for the .ts segment files:

    
    private async Task<ActionResult> GetCameraH264StreamTSFile(string tsFileName)
    {
        string output = Path.Combine(Path.GetTempPath(), tsFileName);
        Response.Headers.Add("Access-Control-Allow-Origin", "*");
        return File(System.IO.File.OpenRead(output), "application/octet-stream", enableRangeProcessing: true);
    }