microservices, domain-driven-design, delay, api-design

Designing a microservice API for delayed response in complex calculations with request concurrency control


I would appreciate it if someone could guide me towards the best approach to implement this idea:

I have a microservice (perhaps an imprecise term) that provides an API for a specific piece of functionality. The work involves fairly complex calculations, so requests take some time to process.

I want to communicate with the microservice and request a calculation without waiting for an immediate response. Instead, I will request the result after some time.

At the same time, I want to prevent the user from initiating a new calculation request as long as the previous one is still in progress.

One idea I've considered is giving each calculation a unique identifier and recording it in the service's records. The identifier is also stored on the user's side, so when the user sends another request to the API, they include the identifier of their last calculation.

In the service, I can check whether the calculation referenced by that identifier has completed or is still ongoing, and either start a new calculation if the previous one is finished or reject the request if it is still processing.
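
Roughly what I have in mind, as a sketch (the names and the in-memory store are just illustrative):

    import uuid
    from concurrent.futures import ThreadPoolExecutor

    executor = ThreadPoolExecutor(max_workers=4)
    jobs = {}  # calculation id -> Future

    def expensive_calculation(params):
        ...  # the long-running work

    def request_calculation(params, previous_id=None):
        # Reject the request if the caller's previous calculation is still running
        if previous_id is not None:
            previous = jobs.get(previous_id)
            if previous is not None and not previous.done():
                raise RuntimeError("previous calculation still in progress")
        new_id = str(uuid.uuid4())
        jobs[new_id] = executor.submit(expensive_calculation, params)
        return new_id

    def get_result(calculation_id):
        future = jobs[calculation_id]
        return future.result() if future.done() else None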

I'm not sure whether this is an appropriate approach or whether there are better alternatives out there; I'm genuinely looking for the best possible solution.


Solution

  • What you describe is a classic asynchronous pattern (see the sketch after this list):

    • The client asks the service to start a process; the service responds with a process identifier
    • The client periodically polls the service for the process status
    • When the process has completed, the client requests the result
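
    A minimal sketch of the polling variant, assuming FastAPI and an in-memory job store (any HTTP framework and real persistence would do; the endpoint names are illustrative):

        import uuid
        from concurrent.futures import ThreadPoolExecutor
        from fastapi import FastAPI, HTTPException

        app = FastAPI()
        executor = ThreadPoolExecutor(max_workers=4)
        jobs = {}  # process id -> Future

        def expensive_calculation(params: dict):
            ...  # long-running work

        @app.post("/calculations", status_code=202)
        def start_calculation(params: dict):
            # Start the process and hand back an identifier the client can poll with
            process_id = str(uuid.uuid4())
            jobs[process_id] = executor.submit(expensive_calculation, params)
            return {"id": process_id, "status_url": f"/calculations/{process_id}"}

        @app.get("/calculations/{process_id}")
        def poll_calculation(process_id: str):
            future = jobs.get(process_id)
            if future is None:
                raise HTTPException(status_code=404, detail="unknown calculation")
            if not future.done():
                return {"status": "in_progress"}
            return {"status": "done", "result": future.result()}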

    An alternative to this is the push pattern (also sketched below):

    • The client publishes a response entrypoint that the service can reach
    • The client asks the service to start a process and to send the result to that entrypoint
    • When the process has completed, the service sends the result to the client
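
    The push variant could look roughly like this, again assuming FastAPI on the service side and the requests library for the outgoing callback (the callback contract is made up for illustration):

        import threading
        import requests
        from fastapi import FastAPI

        app = FastAPI()

        def expensive_calculation(params: dict):
            ...  # long-running work

        def run_and_push(params: dict, callback_url: str):
            # When the process completes, push the result to the client's entrypoint
            result = expensive_calculation(params)
            requests.post(callback_url, json={"result": result}, timeout=10)

        @app.post("/calculations", status_code=202)
        def start_calculation(payload: dict):
            # The client supplies the entrypoint it published for receiving results
            threading.Thread(
                target=run_and_push,
                args=(payload["params"], payload["callback_url"]),
                daemon=True,
            ).start()
            return {"status": "accepted"}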

    A synchronous alternative uses pub/sub with streaming protocols such as gRPC. The communication is similar to push in that the server sends the results to the clients without the clients polling the service. However, the connection is synchronous and initiated by the clients, which can prove useful when they sit behind a NAT or proxy.
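
    A gRPC example needs stubs generated from a .proto file, so here is the same client-initiated, server-pushes-when-ready idea sketched with a plain HTTP streaming response instead (FastAPI assumed; this is a stand-in for the streaming idea, not gRPC itself):

        import json
        import time
        import uuid
        from concurrent.futures import ThreadPoolExecutor
        from fastapi import FastAPI
        from fastapi.responses import StreamingResponse

        app = FastAPI()
        executor = ThreadPoolExecutor(max_workers=4)
        jobs = {}  # process id -> Future

        def expensive_calculation(params: dict):
            ...  # long-running work

        @app.post("/calculations", status_code=202)
        def start_calculation(params: dict):
            process_id = str(uuid.uuid4())
            jobs[process_id] = executor.submit(expensive_calculation, params)
            return {"id": process_id}

        @app.get("/calculations/{process_id}/stream")
        def stream_calculation(process_id: str):
            # The client opens the connection; the server pushes the result when ready
            def events():
                future = jobs[process_id]
                while not future.done():
                    yield "in_progress\n"
                    time.sleep(1)
                yield json.dumps({"result": future.result()}) + "\n"
            return StreamingResponse(events(), media_type="text/plain")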