I want to parse an MJPEG stream with net7.0 on iOS. My current solution works on Windows and Android, but on iOS I cannot receive the raw stream.
What is MJPEG? MJPEG is a continuous stream of JPEGs. The content starts with header information, then comes an image, and after that there is a header again. Example of a response:
Content-Type: image/jpeg
Content-Length: 50706
{Image}
Content-Type: image/jpeg
Content-Length: 50750
{Image2}
After the header, there are always two line breaks. So I parse the response to find the two line breaks, read the Content-Length, and then know that the following 50706 bytes are the image. I extract the image from the stream, parse for the next two line breaks, and so on.
My code is like this:
HttpResponseMessage headerResponse = await client.GetAsync(requestUri, HttpCompletionOption.ResponseHeadersRead, cancellationToken);
Stream stream = await headerResponse.Content.ReadAsStreamAsync(cancellationToken);
{Parsing stuff} ...
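The parsing part looks roughly like this (a simplified sketch of my loop, not the exact production code; error handling is trimmed):

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;

public static class MjpegParser
{
    public static async Task ParseAsync(Stream stream, Action<byte[]> onFrame, CancellationToken ct)
    {
        var headerBytes = new List<byte>();
        while (!ct.IsCancellationRequested)
        {
            headerBytes.Clear();
            int b;
            // Read byte by byte until the two line breaks ("\r\n\r\n") that end the part header.
            while ((b = stream.ReadByte()) != -1)
            {
                headerBytes.Add((byte)b);
                int n = headerBytes.Count;
                if (n >= 4
                    && headerBytes[n - 4] == (byte)'\r' && headerBytes[n - 3] == (byte)'\n'
                    && headerBytes[n - 2] == (byte)'\r' && headerBytes[n - 1] == (byte)'\n')
                {
                    break;
                }
            }
            if (b == -1) return; // stream ended

            // Pull the Content-Length out of the header block.
            string header = Encoding.ASCII.GetString(headerBytes.ToArray());
            Match match = Regex.Match(header, @"Content-Length:\s*(\d+)", RegexOptions.IgnoreCase);
            if (!match.Success) continue;

            // The next Content-Length bytes are exactly one JPEG.
            int length = int.Parse(match.Groups[1].Value);
            byte[] frame = new byte[length];
            int read = 0;
            while (read < length)
            {
                int chunk = await stream.ReadAsync(frame, read, length - read, ct);
                if (chunk == 0) return;
                read += chunk;
            }
            onFrame(frame);
        }
    }
}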
My problem now is that iOS does not hand over the raw content. There is no header information in the content stream. It looks like this:
{Image}
{Image2}
So there is no way for me to split the stream into single images.
Also, headerResponse.Content.Headers.ContentLength does not return the correct value, and it does not change while parsing the stream.
I have already tried different HttpClient message handlers (NSUrlSessionHandler, CFNetworkHandler).
So, is there a way to prevent iOS from removing the header information from the content stream?
It is running now, thanks to two posts I found on the internet: Swift Mjpeg Streaming Only Showing Single Frame and Download Files in Xamarin iOS in the Background.
My solution is a mix of both:
public class MyImageReceiver
{
    // Released once per received frame; initial count 0 so WaitAsync blocks until a frame arrives.
    private readonly SemaphoreSlim _waitHandle = new SemaphoreSlim(0);

    public async Task Get(string requestUri, Action<IEnumerable<byte>> imageReceived, int timeout, CancellationToken cancellationToken)
    {
        bool isTimeout = false;
        DateTime lastImageReceivedAt = DateTime.Now;
        var config = NSUrlSessionConfiguration.DefaultSessionConfiguration;
        using (NSUrl nsUrl = NSUrl.FromString(requestUri))
        using (NSMutableUrlRequest request = new NSMutableUrlRequest(nsUrl,
            cachePolicy: NSUrlRequestCachePolicy.ReloadIgnoringLocalCacheData,
            timeoutInterval: 2000)) // note: timeoutInterval is in seconds
        using (CameraPictureClientSessionDelegate cameraPictureClientSessionDelegate = new CameraPictureClientSessionDelegate())
        using (NSUrlSession session = NSUrlSession.FromConfiguration(config, cameraPictureClientSessionDelegate, null))
        {
            // Set the method before the data task is created; the task copies the request.
            request.HttpMethod = "GET";
            using (NSUrlSessionDataTask streamingTask = session.CreateDataTask(request: request))
            {
                cameraPictureClientSessionDelegate.ImageReceived += (s, e) =>
                {
                    lastImageReceivedAt = DateTime.Now;
                    _waitHandle.Release();
                    imageReceived(e);
                };
                streamingTask.Resume();
                do
                {
                    // Wake up either when a frame arrives or when the timeout elapses.
                    await _waitHandle.WaitAsync(timeout, cancellationToken);
                    if ((DateTime.Now - lastImageReceivedAt).TotalMilliseconds > timeout)
                    {
                        isTimeout = true;
                    }
                }
                while ((streamingTask.State == NSUrlSessionTaskState.Running
                        || streamingTask.State == NSUrlSessionTaskState.Suspended)
                    && !cancellationToken.IsCancellationRequested
                    && !isTimeout);
                if (streamingTask.State == NSUrlSessionTaskState.Running
                    || streamingTask.State == NSUrlSessionTaskState.Suspended)
                {
                    streamingTask.Cancel();
                }
                if (isTimeout)
                {
                    throw new TimeoutException();
                }
            }
        }
    }
}
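To consume it, I do roughly this (illustrative only: the URL is a placeholder, DisplayImage stands in for whatever renders the frame, and the bytes-to-UIImage conversion assumes you show the frame natively):

// needs: using Foundation; using UIKit; using System.Linq;
var receiver = new MyImageReceiver();
var cts = new CancellationTokenSource();
try
{
    await receiver.Get(
        "http://192.168.0.10/mjpeg",   // placeholder camera URL
        bytes =>
        {
            // This fires on a background thread; marshal to the main thread before touching UI.
            using (NSData data = NSData.FromArray(bytes.ToArray()))
            {
                UIImage frame = UIImage.LoadFromData(data);
                DisplayImage(frame);   // placeholder render call
            }
        },
        timeout: 5000,                 // ms without a frame before TimeoutException
        cts.Token);
}
catch (TimeoutException)
{
    // stream stalled: reconnect or show an error
}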
And the delegate:
public class CameraPictureClientSessionDelegate :
    NSUrlSessionDataDelegate,
    INSUrlSessionTaskDelegate
{
    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);
    public event EventHandler<IEnumerable<byte>> ImageReceived;
    private List<byte> _data = new List<byte>();

    // Buffers the raw bytes of the current part; NSUrlSession has already stripped the part headers.
    public override void DidReceiveData(NSUrlSession session, NSUrlSessionDataTask dataTask, NSData data)
    {
        try
        {
            _semaphore.Wait();
            _data.AddRange(data.ToArray());
        }
        finally
        {
            _semaphore.Release();
        }
    }

    // Called by NSUrlSession at every part boundary, so whatever has been
    // buffered since the last call is one complete JPEG frame.
    public override void DidReceiveResponse(NSUrlSession session, NSUrlSessionDataTask dataTask, NSUrlResponse response, Action<NSUrlSessionResponseDisposition> completionHandler)
    {
        try
        {
            _semaphore.Wait();
            if (_data.Count > 0)
            {
                ImageReceived?.Invoke(this, _data);
                _data = new List<byte>();
            }
        }
        finally
        {
            _semaphore.Release();
        }
        completionHandler(NSUrlSessionResponseDisposition.Allow);
    }
}
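Why this works, as far as I understand it from the Swift post: NSUrlSession parses the multipart stream itself and calls DidReceiveResponse once for every part it encounters, i.e. exactly once per frame header it strips away. So everything buffered in DidReceiveData between two DidReceiveResponse calls is one complete JPEG, and that is the moment ImageReceived fires. The semaphore just keeps the buffer consistent between the two callbacks.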