Tags: azure, azure-storage

Getting an error when uploading a file to Azure Storage


I'm converting a website from a standard ASP.NET site to Azure. The site previously took an Excel file uploaded by an administrative user and saved it on the file system. As part of the migration, I'm saving this file to Azure Storage instead. It works fine when running against local development storage through the Azure SDK. (I'm using version 1.3, since I didn't want to upgrade mid-development.)

When I point the code at Azure Storage itself, though, the process usually fails. The error I get is:

  System.IO.IOException occurred

  Message=Unable to read data from the transport connection: The connection was closed.
  Source=Microsoft.WindowsAzure.StorageClient
  StackTrace:
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
       at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
       at Framework.Common.AzureBlobInteraction.UploadToBlob(Stream stream, String BlobContainerName, String fileName, String contentType) in C:\Development\RateSolution2010\Framework.Common\AzureBlobInteraction.cs:line 95
  InnerException: 

The code is as follows:

public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Set up the connection to Windows Azure Storage.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());

    DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
    dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start(storageAccount, dmc);

    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();

    // For large file copies you need to set a custom timeout period, and
    // ParallelOperationThreadCount spreads the copy across multiple threads.
    // If you have plenty of bandwidth you can increase the thread count,
    // because Azure accepts the blocks of a block blob in any order of arrival.
    BlobClient.Timeout = new TimeSpan(1, 0, 0);
    Role serviceRole = RoleEnvironment.Roles.Where(s => s.Value.Name == "OnlineRates.Web").First().Value;
    BlobClient.ParallelOperationThreadCount = serviceRole.Instances.Count;

    // Get the container, creating it if it doesn't exist yet.
    CloudBlobContainer BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    BlobContainer.CreateIfNotExist();

    // Delete the prior version if one exists.
    BlobRequestOptions options = new BlobRequestOptions();
    options.DeleteSnapshotsOption = DeleteSnapshotsOption.None;
    CloudBlob blobToDelete = BlobContainer.GetBlobReference(fileName);
    if (blobToDelete.DeleteIfExists(options))
    {
        Trace.WriteLine("Blob " + fileName + " deleted to be replaced by newer version.");
    }

    // Rewind the stream to the starting position before uploading.
    stream.Position = 0;
    long totalBytes = 0;

    using (stream)
    {
        // Create the blob and upload the file.
        CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
        try
        {
            // Track upload progress by summing the sizes of the Put Block requests.
            BlobClient.ResponseReceived += (obj, responseReceivedEventArgs) =>
            {
                if (responseReceivedEventArgs.RequestUri.ToString().Contains("comp=block&blockid"))
                {
                    totalBytes += Int64.Parse(responseReceivedEventArgs.RequestHeaders["Content-Length"]);
                }
            };
            blob.UploadFromStream(stream);

            // Set the metadata on the blob.
            blob.Metadata["FileName"] = fileName;
            blob.SetMetadata();

            // Set the content type.
            blob.Properties.ContentType = contentType;
            blob.SetProperties();
        }
        catch (Exception exc)
        {
            Logging.ExceptionLogger.LogEx(exc);
        }
    }
}

I've tried a number of alterations to the code: deleting the blob before replacing it (although the problem occurs on new blobs as well), setting container permissions, not setting them, and so on.


Solution

  • The problem turned out to be the firewall settings on my laptop. It's my personal laptop, originally set up at home, so the firewall rules weren't configured for a corporate environment, resulting in slow performance on uploads and downloads.
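Independent of the firewall fix, uploads over slow or flaky connections tend to be more resilient with an explicit retry policy on the blob client. A minimal sketch against the SDK 1.3 `StorageClient` API (it reuses `GetConnStr()` from the code above; the retry count and back-off interval are illustrative values, not recommendations):

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Sketch: configure the blob client to retry transient failures
// (connection resets, timeouts) with exponential back-off.
CloudStorageAccount account = CloudStorageAccount.Parse(GetConnStr());
CloudBlobClient client = account.CreateCloudBlobClient();

// Retry each failed request up to 5 times, with roughly a 2-second
// delta back-off between attempts (illustrative values).
client.RetryPolicy = RetryPolicies.RetryExponential(5, TimeSpan.FromSeconds(2));

// A generous per-request timeout also helps large uploads on slow links.
client.Timeout = TimeSpan.FromHours(1);
```

With this in place, a dropped connection during one block upload is retried rather than surfacing immediately as an `IOException` to the caller.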