Tags: c#, uwp, raspberry-pi, video-streaming, libvlcsharp

Best way to stream low-latency video from a Raspberry Pi to a UWP app


For a project, I have to communicate with a Raspberry Pi Zero from a UWP app via TCP. Because both the Raspberry and the computer running the interface only have private IP addresses, I have to use a server to forward messages from one client to the other. This part already works, but now my problem is that I have to implement video streaming from the Raspberry to the UWP app.

Because my partner is in charge of creating and designing the UWP app, I have made myself a little test interface with Windows Forms. I have tried several techniques, like piping the video output through Netcat over the server to the client or direct TCP streaming with raspivid, but the best solution so far is the one I found in this project here. Instead of using the Eneter.Messaging library, however, I use my own class for communication based on TcpClient.

I use Mono to run my C# script on the Raspberry, and the code to stream the video looks like this:

while (true)
{
    // Wait with streaming until the interface is connected
    while (!RemoteDeviceConnected || VideoStreamPaused)
    {
        Thread.Sleep(500);
    }
    // Start raspivid only if it is not already running
    if (!Array.Exists(Process.GetProcesses(), p => p.ProcessName.Contains("raspivid")))
        raspivid.Start();
    Thread.Sleep(2000);
    VideoData = new byte[VideoDataLength];
    try
    {
        // ReadAsync returns 0 at end of stream (never -1), so loop while data keeps arriving
        int bytesRead;
        while ((bytesRead = await raspivid.StandardOutput.BaseStream.ReadAsync(VideoData, 0, VideoDataLength)) > 0
               && !VideoChannelToken.IsCancellationRequested && RemoteDeviceConnected && !VideoStreamPaused)
        {
            // Send the captured chunk to the connected client
            VideoConnection.SendByteArray(VideoData, bytesRead);
        }
        raspivid.Kill();
        Console.WriteLine("Raspivid killed");
    }
    catch (ObjectDisposedException)
    {
    }
}

Basically, this method just reads the H.264 data from the standard output stream of the raspivid process in chunks and sends it to the server.
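
The raspivid process object itself is not shown above; purely as an illustration, it might be configured along these lines (the exact flags are an assumption, chosen to write an endless low-resolution H.264 stream with inline SPS/PPS headers to stdout):

    using System.Diagnostics;

    // Hypothetical setup of the raspivid process used above; the flags are assumptions
    // (-t 0: no timeout, -ih: inline headers, -n: no preview, -o -: write to stdout)
    var raspivid = new Process
    {
        StartInfo = new ProcessStartInfo
        {
            FileName = "raspivid",
            Arguments = "-t 0 -w 640 -h 480 -fps 25 -ih -n -o -",
            RedirectStandardOutput = true,
            UseShellExecute = false
        }
    };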

The next method runs on the server and just forwards the byte array to the connected interface client.

while (RCVVideo[id].Connected)
{
    // Forward only the bytes that were actually read from the Raspberry's connection
    int bytesRead = await RCVVideo[id].stream.ReadAsync(VideoData, 0, VideoDataLength);
    if (bytesRead > 0 && IFVideo[id] != null && IFVideo[id].Connected)
    {
        IFVideo[id].SendByteArray(VideoData, bytesRead);
    }
}

SendByteArray() uses the NetworkStream.Write() method.
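
SendByteArray() itself is not part of the question; a minimal sketch of what it could look like, assuming the connection class keeps the NetworkStream of its TcpClient in a stream field:

    // Hypothetical implementation; only the use of NetworkStream.Write() is
    // taken from the question, the rest is assumed.
    public void SendByteArray(byte[] data, int length)
    {
        stream.Write(data, 0, length);
        stream.Flush();
    }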

On the interface, I write the received byte[] to a named pipe, to which the VLC control connects:

while (VideoConnection.Connected)
{
    // Pass each received chunk straight on to the pipe that VLC reads from
    int bytesRead = await VideoConnection.stream.ReadAsync(VideoData, 0, VideoDataLength);
    if (bytesRead > 0 && VideoPipe.IsConnected)
    {
        VideoPipe.Write(VideoData, 0, bytesRead);
    }
}

The following code initializes the pipe server:

// Open pipe that will be read by VLC.
VideoPipe = new NamedPipeServerStream(@"\raspipipe",
                                      PipeDirection.Out, 1,
                                      PipeTransmissionMode.Byte,
                                      PipeOptions.WriteThrough, 0, 10000);
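
Not shown in the question is where the pipe actually accepts VLC's connection; presumably something like the following runs before the read loop above (an assumption, since only IsConnected is checked there):

    // Assumption: accept the player's connection asynchronously so the
    // network read loop is not blocked while VLC opens the pipe.
    _ = VideoPipe.WaitForConnectionAsync();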

And for VLC:

LibVLC libVLC = new LibVLC();

videoView1.MediaPlayer = new MediaPlayer(libVLC);

videoView1.MediaPlayer.Play(new Media(libVLC, @"stream/h264://\\\.\pipe\raspipipe", FromType.FromLocation));
videoView1.MediaPlayer.EnableHardwareDecoding = true;
videoView1.MediaPlayer.FileCaching = 0;
videoView1.MediaPlayer.NetworkCaching = 300;

This works fine in the Windows Forms app, and I can get the delay down to 2 or 3 seconds (it should be better in the end, but it is acceptable for now). In the UWP app, however, I can't get it to work, even after adding /LOCAL/ to the pipe name. The VLC control reports that it connects to the pipe, and I can see that data is written to the pipe, but it doesn't display any video.

So my question is:

How can I get this to work with the VLC control (LibVLCSharp) in UWP? Am I missing something fundamental?

Or is there even a better way to stream the video in this case?

I have also researched the UWP MediaPlayerElement a bit, but I can't find a way to get my byte[] into it.


Solution

  • First of all, thank you for your quick responses and interesting ideas!

    I took a look at the Desktop Bridge, but it is not really what I wanted, because my colleague has already put a lot of effort into designing the UWP app and my Windows Forms interface is just a botch to try things out.

    But the thing that really worked for me was StreamMediaInput. I have no idea how I missed this before. This way I just pass my NetworkStream directly to the MediaPlayer without using a named pipe.

    LibVLC libVLC = new LibVLC();

    videoView1.MediaPlayer = new MediaPlayer(libVLC);
    // ":demux=h264" tells libVLC to treat the incoming bytes as a raw H.264 elementary stream
    Media streamMedia = new Media(libVLC, new StreamMediaInput(Client.Channels.VideoConnection.stream), ":demux=h264");

    videoView1.MediaPlayer.EnableHardwareDecoding = true;
    videoView1.MediaPlayer.FileCaching = 0;
    videoView1.MediaPlayer.NetworkCaching = 500;

    videoView1.MediaPlayer.Play(streamMedia);


    This solution now works for me both in UWP and in Windows Forms.