My C# project is used to do 2 actions:

1. Download a file from SharePoint Online (SPO) as a stream.
2. Upload that stream to Azure Blob Storage.
I recently got a requirement to include one more step between steps 1 and 2: save a copy of this stream as a file in a local folder before saving it to the Azure blob. When I copy this stream to a local file using streamObject.CopyTo(localFileLocation), the source stream ends up empty and the blob file becomes 0 bytes.
string content = string.Empty;

var stream = await this.DownloadFromSPO(pipeline).ConfigureAwait(false);
if (stream == null)
{
    return false;
}

// saving stream as file into local folder
string filePath2 = Path.Combine("C:\\localLocation", pipeline.Name);
using (StreamReader reader = new StreamReader(stream))
{
    content = reader.ReadToEnd();
    FileStream outputFileStream = new FileStream(filePath2, FileMode.Create);
    stream.Position = 0;
    stream.CopyTo(outputFileStream);
}

stream = helper.GetStream(content);

// Upload the file to BLOB Storage using drive item id as the blob file name
CloudBlob blob = null;
await AzureRetryHelper.OperationWithBasicRetryAsync(async () =>
{
    blob = await this.blobHelper.UploadFileToBlob(
        stream,
        this.config.GetConfigValue(Constants.DataContainerName),
        pipeline.BlobNativeFilePath,
        true,
        pipeline.MimeType).ConfigureAwait(false);
}).ConfigureAwait(false);
That is why I am trying to add the code below:

stream = helper.GetStream(content); // to take the stream back and then save it to blob storage
Here the GetStream function has the following code:
public static Stream GetStream(string fileContent)
{
    MemoryStream ms = new MemoryStream();
    StreamWriter writer = new StreamWriter(ms);
    writer.WriteLine(fileContent);
    writer.Flush();
    // Rewind the MemoryStream before copying
    ms.Seek(0, SeekOrigin.Begin);
    return ms;
}
After this, the saved blob file is corrupted.

Any help with this?
I'm not sure exactly where it goes wrong, but converting the stream to a string and then writing that string to a new memory stream is almost certainly not the way to do it: arbitrary binary content will not survive being decoded and re-encoded as text, and WriteLine also appends a line break that was never in the original data.
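As a quick illustration (a standalone sketch, not part of the question's code) of why a text round-trip is lossy for binary data: invalid byte sequences are replaced during decoding, so the bytes you write back are not the bytes you read.

using System;
using System.IO;
using System.Text;

// Illustration only: arbitrary binary bytes do not survive a text round-trip.
byte[] original = { 0x00, 0xFF, 0xFE, 0x89, 0x50, 0x4E, 0x47 };

string asText;
using (var reader = new StreamReader(new MemoryStream(original)))
{
    asText = reader.ReadToEnd(); // invalid UTF-8 sequences are replaced with U+FFFD
}

byte[] roundTripped = Encoding.UTF8.GetBytes(asText);

// The byte counts no longer match, so a file rebuilt this way is corrupted.
Console.WriteLine($"{original.Length} bytes in, {roundTripped.Length} bytes out");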
You need to be somewhat careful when using streams, since not all streams are "seekable"; this is especially true when dealing with network streams, where you cannot ask the server to resend bytes at random.
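For instance, you can check Stream.CanSeek before trying to rewind; a minimal sketch (StreamUtilities and EnsureSeekableAsync are hypothetical names, not part of the question's code):

using System.IO;
using System.Threading.Tasks;

public static class StreamUtilities
{
    // Hypothetical helper: returns the original stream if it can be rewound,
    // otherwise drains it once into a seekable in-memory copy.
    public static async Task<Stream> EnsureSeekableAsync(Stream source)
    {
        if (source.CanSeek)
        {
            return source;
        }

        var buffer = new MemoryStream();
        await source.CopyToAsync(buffer).ConfigureAwait(false);
        buffer.Position = 0;
        return buffer;
    }
}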
So I would suggest copying the stream to memory, and then using that copy for whatever you want to do:
// Buffer the (possibly non-seekable) source stream in memory once.
using var ms = new MemoryStream();
stream.CopyTo(ms);

// Rewind and write the local copy.
ms.Position = 0;
using var outputFileStream = new FileStream(filePath2, FileMode.Create);
ms.CopyTo(outputFileStream);

// Rewind again so the upload starts from the beginning.
ms.Position = 0;

CloudBlob blob = null;
await AzureRetryHelper.OperationWithBasicRetryAsync(async () =>
{
    blob = await this.blobHelper.UploadFileToBlob(
        ms,
        this.config.GetConfigValue(Constants.DataContainerName),
        pipeline.BlobNativeFilePath,
        true,
        pipeline.MimeType).ConfigureAwait(false);
}).ConfigureAwait(false);
Note that this might fail for very large streams. As far as I know, memory streams are limited to about 2 GB, since they are backed by an array and that is the maximum number of elements an array can hold.
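If your files can get anywhere near that size, one option (a sketch only, reusing the question's own names such as stream, filePath2, blobHelper and pipeline) is to skip the in-memory copy entirely: write the download straight to the local file, then re-open that file as the upload source, since a FileStream is seekable:

// Stream the download straight to disk instead of buffering it in memory.
using (var outputFileStream = new FileStream(filePath2, FileMode.Create))
{
    await stream.CopyToAsync(outputFileStream).ConfigureAwait(false);
}

CloudBlob blob = null;
await AzureRetryHelper.OperationWithBasicRetryAsync(async () =>
{
    // Each retry attempt opens a fresh FileStream, so the upload always starts at byte 0.
    using (var uploadStream = File.OpenRead(filePath2))
    {
        blob = await this.blobHelper.UploadFileToBlob(
            uploadStream,
            this.config.GetConfigValue(Constants.DataContainerName),
            pipeline.BlobNativeFilePath,
            true,
            pipeline.MimeType).ConfigureAwait(false);
    }
}).ConfigureAwait(false);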