azure, azure-storage, azure-worker-roles, azure-table-storage

Azure Table Storage throws exception: Unable to read data from the transport connection


I'm running a long Azure Table Storage query that takes 6-7 hours. After 5-6 hours, Azure Table Storage throws the exception "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.":

    "Exception : Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host., Stack Trace :    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
       at Microsoft.WindowsAzure.Storage.Table.TableQuery`1.<>c__DisplayClass7.<ExecuteInternal>b__6(IContinuationToken continuationToken)
       at Microsoft.WindowsAzure.Storage.Core.Util.CommonUtility.<LazyEnumerable>d__0`1.MoveNext()
       at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
       at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)"


I have no clue what is causing the problem; can anyone help me find the reason for this error? I have already applied the following ServicePointManager settings:

    ServicePointManager.DefaultConnectionLimit = 48;
    ServicePointManager.Expect100Continue = false;
    ServicePointManager.UseNagleAlgorithm = false;

I'm using an A7 instance (8 CPU cores, 56 GB RAM); it is failing even with this high-end configuration.

I also included retry logic on the Table Storage query during execution, but no luck:

    var DefaultRequestOptions = new TableRequestOptions
    {
        RetryPolicy = new ExponentialRetry(TimeSpan.FromSeconds(3), 3),
        //PayloadFormat = TablePayloadFormat.JsonNoMetadata
    };

    AzureTableQuery.Execute(DefaultRequestOptions).ToList();

I also checked Network In: it is showing 100 GB. Is there a limit on network bandwidth? I would appreciate any help on this. Thanks in advance.


Solution

  • For a query that takes this long, it’s much better to process the results piecemeal rather than trying to download everything at once. That way, if your query fails at any point, you don’t have to re-download everything. For example:

        TableContinuationToken token = null;
        try
        {
            do
            {
                // Fetch one segment (up to 1,000 entities) of query results.
                TableQuerySegment<ITableEntity> segment = AzureTableQuery.ExecuteSegmented(token);

                // Do something with segment.Results, which is this batch of results from the query.
                List<ITableEntity> results = segment.Results;

                // Save the continuation token for the next iteration.
                token = segment.ContinuationToken;

            } while (token != null);
        }
        catch (Exception e)
        {
            // Handle the exception, retry, etc. The last saved token marks where to resume.
        }

    This way, you have partial results even if the query fails partway through, and you have the continuation token, so you can resume the query from where you left off rather than starting at the beginning. The token can even be persisted across process restarts, as sketched below.
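    A minimal sketch of persisting the token, assuming the Microsoft.WindowsAzure.Storage SDK from your stack trace (whose TableContinuationToken exposes NextPartitionKey and NextRowKey as public, settable properties); how and where you durably store the two strings is up to you:

        // Persist the fields that identify where the scan stopped.
        // (The durable store -- file, blob, queue -- is an assumption left to you.)
        string nextPartitionKey = token.NextPartitionKey;
        string nextRowKey = token.NextRowKey;

        // On restart, rebuild the token and re-issue the same query from that point.
        TableContinuationToken resumeToken = new TableContinuationToken
        {
            NextPartitionKey = nextPartitionKey,
            NextRowKey = nextRowKey
        };
        TableQuerySegment<ITableEntity> resumed = AzureTableQuery.ExecuteSegmented(resumeToken);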

    Please note that most table scans are not very efficient; if your scenario is latency sensitive, you may want to redesign your table to allow more efficient queries, as sketched below. Also, I’m not sure how you’re getting 100 GB on Network In, but it’s definitely not all coming from this one query; Azure Storage won’t push data that rapidly for a single query.
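    A hedged sketch of a more efficient, partition-scoped query (the CloudTable variable table, the use of DynamicTableEntity, and the key values are placeholders, not from the question). Because the filter pins the PartitionKey, the server can seek straight to the matching rows instead of scanning the whole table:

        // Hypothetical names throughout -- substitute your own table and key values.
        TableQuery<DynamicTableEntity> scopedQuery = new TableQuery<DynamicTableEntity>().Where(
            TableQuery.CombineFilters(
                TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "somePartition"),
                TableOperators.And,
                TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.GreaterThanOrEqual, "someRowKey")));

        foreach (DynamicTableEntity entity in table.ExecuteQuery(scopedQuery))
        {
            // Entities come from a single partition; no full table scan is needed.
        }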