Tags: node.js, next.js, backend, next.js13

How does Next API Route handle multiple concurrent requests?


Suppose I have created a Next.js app that has an API route. Two users send requests to that API to upload a file to an S3 bucket, and then my backend does some processing on each file, which could take more than 2 minutes. How is that handled, and how is it ensured that the users get their own processed files instead of each other's? Essentially, how does this API serve multiple users, assuming it's being served by just one server?

Is there a built-in message queue that queues the requests for processing and returns the appropriate responses using HTTP headers?


Solution

  • There is no built-in message queue in Next.js itself.

    A Next.js API route is just served by a Node.js server, so your question really becomes "How does Node.js handle multiple concurrent requests?" (a minimal route sketch at the end of this answer shows what this looks like in practice).

    Node.js uses a single thread with an event loop. In this way, Node can handle thousands of concurrent connections.

    From here

    Node took a slightly different approach to handling multiple concurrent requests than some other popular servers like Apache. Spawning a new thread for each request is expensive, and those threads sit idle while awaiting the result of other operations (e.g. a database read).

    Such an approach has numerous advantages. There is no overhead from creating new threads, and your code is much easier to reason about: you don't have to worry about what happens if two threads access the same variable, because that simply cannot happen. There are some drawbacks as well. Node isn't the best choice for applications that mostly do CPU-intensive computing; on the other hand, it excels at handling many I/O requests.

    File uploading is considered an I/O (Input/Output) operation in the context of web applications. When a user uploads a file through a web interface, the server must read the data from the client's device (input) and then write it to a storage location (output). Both reading from the client and writing to storage are I/O operations.

    In a nutshell, async code is first registered with the runtime. The event loop continuously checks the status of those registered async operations, managing a queue of completed events and their associated callback functions. When an async operation completes, it is pushed onto the event loop's queue, and its callback is scheduled to run on the main thread as soon as the thread becomes available.
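
    To make the per-request isolation concrete, here is a minimal sketch of what such an upload route could look like. This is only an illustration under assumptions: the bucket name, region, and `processFile` helper are placeholders, and a real upload would stream the file rather than rely on the default body parser.

    ```ts
    // pages/api/upload.ts -- illustrative sketch only
    import type { NextApiRequest, NextApiResponse } from "next";
    import { randomUUID } from "crypto";
    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

    const s3 = new S3Client({ region: "us-east-1" }); // assumed region

    export default async function handler(req: NextApiRequest, res: NextApiResponse) {
      // Every incoming request gets its own invocation of this function,
      // with its own req/res pair and its own local variables, so two
      // users' uploads never share state.
      const key = `uploads/${randomUUID()}`; // per-request object key

      // Both steps below are I/O. While they are awaited, the single Node.js
      // thread is free to start handling the other user's request.
      await s3.send(
        new PutObjectCommand({
          Bucket: "my-bucket", // placeholder bucket
          Key: key,
          Body: req.body,      // simplified; a real file upload would stream the body
        })
      );

      const result = await processFile(key); // placeholder for the long processing step

      // res belongs to this request only, so each caller gets back
      // their own processed file, not the other user's.
      res.status(200).json({ key, result });
    }

    // Stand-in for whatever processing is done on the uploaded file.
    // If this work were CPU-bound rather than I/O-bound, it would block the
    // event loop and should be moved to a worker or an external queue instead.
    async function processFile(key: string): Promise<string> {
      return `processed:${key}`;
    }
    ```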
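
    And here is a tiny, framework-free illustration of the event loop interleaving two slow "requests". `setTimeout` stands in for any awaited I/O such as an S3 upload or a database read; both operations are registered immediately and their callbacks run as the timers complete, so the total time is about 2 seconds rather than 4.

    ```ts
    async function fakeRequest(user: string): Promise<void> {
      console.log(`${user}: upload started`);
      // Registers a timer and yields control back to the event loop.
      await new Promise((resolve) => setTimeout(resolve, 2000));
      console.log(`${user}: processing finished`);
    }

    // Both "requests" start before either one finishes.
    fakeRequest("user A");
    fakeRequest("user B");
    ```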