I am receiving my image set from a Web API as a list of ImageData objects, each containing a byte array.
public class ImageData
{
    public byte[] data;
}
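For context, this is roughly what I assume one serialized ImageData looks like on the wire (as far as I know, Json.NET writes a byte[] as a Base64 string):

// Simplified sketch of the server-side serialization (my assumption, not the actual API code):
var json = JsonConvert.SerializeObject(new ImageData { data = new byte[] { 1, 2, 3 } });
// json == {"data":"AQID"}  -> the byte[] ends up as a Base64 string inside the JSON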
When I have many small images, I can receive the API response and deserialize it using JObject without any issues. The code below works perfectly.
using (var sr = new StreamReader(stream))
using (var jr = new JsonTextReader(sr))
{
    while (jr.Read())
    {
        if (jr.TokenType == JsonToken.StartObject)
        {
            imageData = JObject.Load(jr).ToObject<ImageData>();
        }
    }
}
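For reference, I assume the same loop could also deserialize straight from the reader with a JsonSerializer instead of going through a JObject (sketch only; I have mainly used the JObject version above):

var serializer = new JsonSerializer();
while (jr.Read())
{
    if (jr.TokenType == JsonToken.StartObject)
    {
        // Deserialize directly from the reader, skipping the intermediate JObject
        imageData = serializer.Deserialize<ImageData>(jr);
    }
}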
However, sometimes I have a single very large image file (more than 200MB). In this case, the regular deserialization approach does not work: I keep getting OutOfMemoryException errors.
I tried reading the response in chunks and assembling the byte array myself, but the total number of bytes read is always bigger than the actual image size. If the original image is around 220MB, what I end up with is around 295MB, I believe due to encoding (presumably the byte[] is Base64-encoded inside the JSON). So the image can never be written correctly. Below is how I do the buffered reading.
byte[] buffer = new byte[1024];
List<byte[]> imageBytes = new List<byte[]>();
while (true)
{
    int read = stream.Read(buffer, 0, buffer.Length);
    if (read <= 0)
        break;
    // Copy only the bytes actually read instead of adding the shared buffer itself
    var chunk = new byte[read];
    Array.Copy(buffer, chunk, read);
    imageBytes.Add(chunk);
}
var output = new byte[imageBytes.Sum(arr => arr.Length)];
int writeIdx = 0;
foreach (var byteArr in imageBytes)
{
    byteArr.CopyTo(output, writeIdx);
    writeIdx += byteArr.Length;
}
imageData = new ImageData() { data = output };
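If I do the Base64 math on my numbers, the overhead seems to line up with what I am seeing, so I suspect the extra bytes are the Base64 text of the JSON payload rather than corruption:

// Rough sanity check of the Base64 overhead (my assumption about where the extra bytes come from)
long originalBytes = 220L * 1024 * 1024;             // ~220MB image
long base64Chars = ((originalBytes + 2) / 3) * 4;    // 4 output characters per 3 input bytes
Console.WriteLine(base64Chars / (1024.0 * 1024.0));  // ~293MB, close to the ~295MB I end up with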
What am I missing here? How can I get the image data out of this huge payload without memory exceptions or extra bytes?
---- Update ----
I tried the code below, but I still end up with more bytes than the original.
var ms = new MemoryStream();
var buffer = new byte[1024];
int read, count = 0;
long bytesRead = 0;
while (true)
{
    read = await stream.ReadAsync(buffer, 0, 1024);
    ++count;
    if (read <= 0)
        break;
    bytesRead += read;
    ms.Write(buffer, 0, read);
}
imageData = new ImageData() { data = ms.ToArray() };
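To check where the extra bytes come from, I could peek at the start of the buffered stream; if my encoding theory is right, I expect JSON/Base64 text rather than raw pixel data (diagnostic sketch only):

// Peek at the first bytes of the buffered response (diagnostic sketch)
ms.Position = 0;
var head = new byte[64];
int headRead = ms.Read(head, 0, head.Length);
Console.WriteLine(Encoding.ASCII.GetString(head, 0, headRead));
// If this prints something like {"data":"... then the payload is still the
// Base64-encoded JSON, which would explain the size difference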
I also tried using a FileStream; the resulting temp.dcm is again about 290MB, while the original image is about 210MB:
string file = @"C:\Test\\temp.dcm";
using (FileStream fs = new FileStream(file, FileMode.Create, FileAccess.Write,
FileShare.None, 4096, useAsync: true))
{
await response.Content.CopyToAsync(fs);
}
So apparently there is no easy way to directly deserialize a huge response containing a single object without getting memory exceptions. Instead, I ended up changing my API responses depending on the payload.
If there are many smaller images, I send them as a list of ImageData objects as usual and use JObject to deserialize them.
If there is one single large image, I send only the raw byte array from the API and read the bytes from the received response.
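On the client, the large-image path now looks roughly like this (a sketch, assuming the API returns the raw bytes, e.g. as application/octet-stream, instead of JSON):

// Sketch of the large-image path, assuming the response body is the raw image bytes
using (var responseStream = await response.Content.ReadAsStreamAsync())
using (var fs = new FileStream(@"C:\Test\temp.dcm", FileMode.Create, FileAccess.Write))
{
    await responseStream.CopyToAsync(fs);
}
// Or, if the byte[] is still needed in memory:
// byte[] bytes = await response.Content.ReadAsByteArrayAsync();
// imageData = new ImageData { data = bytes };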