c#, azure, azure-durable-functions, azure-api-apps

Azure ScheduleNewOrchestrationInstanceAsync - input size


I was tasked with creating an Azure Function API for uploading files to a Blob Storage container. The file is base64-encoded and passed to the function via a POST request, inside a JSON body together with other parameters needed for the upload.

The simple workflow would be:

  1. The caller calls the HTTP trigger (an async Durable Function).
  2. The function schedules a new orchestration instance.
  3. The instance calls an activity trigger for the upload.
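
The orchestrator and activity are essentially a pass-through to the upload. A simplified sketch, not my exact code (the activity name and body are illustrative; only UploadFileOrchestrator matches my real code):

using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.DurableTask;

[Function(nameof(UploadFileOrchestrator))]
public static async Task UploadFileOrchestrator(
    [OrchestrationTrigger] TaskOrchestrationContext context)
{
    // The orchestration input is the raw JSON request body, file included.
    string requestBody = context.GetInput<string>() ?? string.Empty;
    await context.CallActivityAsync(nameof(UploadFileActivity), requestBody);
}

[Function(nameof(UploadFileActivity))]
public static async Task UploadFileActivity([ActivityTrigger] string requestBody)
{
    // Decode the base64 file from the JSON and upload it to the Blob Storage container.
}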

But when I try to upload a file larger than 4 MB, I get this error:

Exception: System.AggregateException: One or more errors occurred. (Status(StatusCode="ResourceExhausted", Detail="Received message larger than max (5876735 vs. 4194304)"))
[2023-07-10T12:41:01.044Z]  ---> Grpc.Core.RpcException: Status(StatusCode="ResourceExhausted", Detail="Received message larger than max (5876735 vs. 4194304)")
[2023-07-10T12:41:01.046Z]    at Microsoft.DurableTask.Client.Grpc.GrpcDurableTaskClient.ScheduleNewOrchestrationInstanceAsync(TaskName orchestratorName, Object input, StartOrchestrationOptions options, CancellationToken cancellation)

More specifically, this is the part of the code that fails with large files:

// Function input comes from the request content.
StartOrchestrationOptions options = new StartOrchestrationOptions {InstanceId = CorreliationId};
instanceId = await client.ScheduleNewOrchestrationInstanceAsync(nameof(UploadFileOrchestrator), requestBody, options);

requestBody is the JSON with the file inside, so it becomes the instance input. The error tells me that the input data can't be larger than 4 MB. How can I work around this problem or increase the maximum input size?

I was unable to find any documentation on the orchestrator input size limit.


Solution

  • I believe this is the default message size limit of the gRPC server that Durable Functions uses internally for the .NET isolated worker model.

    You could switch to the in-process model, which should not have this limit, or use the claim-check pattern (which Durable Functions itself also uses in general for large messages) to work around this; see the sketch below this answer.

    Also, open a bug on the GitHub repo to get this addressed.
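
    A minimal sketch of the claim-check idea in the .NET isolated model (assuming Azure.Storage.Blobs, a "staging" container, and the AzureWebJobsStorage connection string; the activity name and JSON shape are illustrative): the HTTP trigger writes the file to a blob first and passes only the blob name to the orchestration, so the gRPC message stays small.

    using System;
    using System.Text.Json;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Microsoft.Azure.Functions.Worker;
    using Microsoft.Azure.Functions.Worker.Http;
    using Microsoft.DurableTask;
    using Microsoft.DurableTask.Client;

    public record UploadRequest(string FileName, string FileBase64);

    public static class ClaimCheckUpload
    {
        [Function("UploadFile_HttpStart")]
        public static async Task<HttpResponseData> HttpStart(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestData req,
            [DurableClient] DurableTaskClient client)
        {
            string json = await req.ReadAsStringAsync() ?? string.Empty;
            UploadRequest request = JsonSerializer.Deserialize<UploadRequest>(
                json, new JsonSerializerOptions(JsonSerializerDefaults.Web))!;

            // 1. Stage the large payload in Blob Storage *before* starting the orchestration.
            string connection = Environment.GetEnvironmentVariable("AzureWebJobsStorage")!;
            var staging = new BlobContainerClient(connection, "staging");
            await staging.CreateIfNotExistsAsync();

            string blobName = $"{Guid.NewGuid()}/{request.FileName}";
            byte[] fileBytes = Convert.FromBase64String(request.FileBase64);
            await staging.UploadBlobAsync(blobName, new BinaryData(fileBytes));

            // 2. Pass only the small "claim check" (the blob name) as the orchestration
            //    input, which stays far below the 4 MB gRPC message limit.
            string instanceId = await client.ScheduleNewOrchestrationInstanceAsync(
                nameof(UploadFileOrchestrator), blobName);

            return await client.CreateCheckStatusResponseAsync(req, instanceId);
        }

        [Function(nameof(UploadFileOrchestrator))]
        public static async Task UploadFileOrchestrator(
            [OrchestrationTrigger] TaskOrchestrationContext context)
        {
            // The orchestration only ever sees the blob name, never the file content.
            string blobName = context.GetInput<string>()!;
            await context.CallActivityAsync(nameof(UploadFileActivity), blobName);
        }

        [Function(nameof(UploadFileActivity))]
        public static async Task UploadFileActivity([ActivityTrigger] string blobName)
        {
            string connection = Environment.GetEnvironmentVariable("AzureWebJobsStorage")!;
            var staging = new BlobContainerClient(connection, "staging");
            var destination = new BlobContainerClient(connection, "uploads");
            await destination.CreateIfNotExistsAsync();

            // Resolve the claim check: copy the staged file to its final container.
            BinaryData content = (await staging.GetBlobClient(blobName)
                .DownloadContentAsync()).Value.Content;
            await destination.UploadBlobAsync(blobName, content);
        }
    }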