I have a heavy data processing operation that I need to get done per 10-12 simultaneous requests. I have read that for a higher level of concurrency Node.js is a good platform and it achieves it by having a non-blocking event loop.
What I know is that for things like querying a database, the work can be handed off to a separate process (like mongod or mysqld), with a callback to handle the result from that process. Fair enough.
But what if I want a heavy piece of computation done within a callback itself? Won't it block other requests until the code in that callback finishes executing? For example, say I want to process a high-resolution image, and the code I have is in JavaScript itself (no separate process to do the image processing).
The way I'm thinking of implementing it is something like:

get_image_from_db(image_id, function (imageBitMap) {
    heavy_operation(imageBitMap); // Can take 5 seconds.
});
Will that heavy_operation stop Node from accepting any requests for those 5 seconds? Or am I going about this task the wrong way? Please guide me, I am a JS newbie.
UPDATE
Or could I process part of the image, let the event loop go back and handle other callbacks, and then return to process the next part of the image? (Something like prioritizing events.)
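For instance, something along these lines is what I have in mind (just a rough sketch; split_into_chunks and process_chunk are made-up helpers standing in for the real image code):

get_image_from_db(image_id, function (imageBitMap) {
    var chunks = split_into_chunks(imageBitMap); // made-up helper
    var i = 0;
    (function next() {
        if (i >= chunks.length) return; // all chunks done
        process_chunk(chunks[i++]);     // a small slice of the heavy work
        setImmediate(next);             // yield to the event loop before the next slice
    })();
});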
Yes, it will block: callback functions execute on the main event loop, and it is only the underlying asynchronous operations (the I/O handled outside your JavaScript) that don't block it. My understanding is that if you want the image processing to run asynchronously, you will have to use a separate process for it.
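You can observe the blocking directly with a minimal sketch: the synchronous busy-wait below starves a timer that should otherwise keep firing.

setInterval(function () {
    console.log('tick'); // should print every 100 ms
}, 100);

setTimeout(function () {
    var end = Date.now() + 5000;
    while (Date.now() < end) { } // synchronous busy-wait standing in for heavy_operation
    console.log('done');         // no 'tick' is printed during those 5 seconds
}, 1000);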
Note that you can write your own asynchronous process to handle it. To get started, you could read the answers to How to write asynchronous functions for Node.js.
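For example, one common approach is child_process.fork, shipping the heavy work to a worker script. A minimal sketch, assuming a worker.js file and a simple message format (both are my assumptions, not a fixed API):

// main.js
var fork = require('child_process').fork;

get_image_from_db(image_id, function (imageBitMap) {
    var worker = fork(__dirname + '/worker.js'); // assumed worker script
    worker.send({ image: imageBitMap });         // hand the heavy work off
    worker.on('message', function (result) {     // result arrives asynchronously
        worker.kill();
        // use the processed image here; the event loop stayed free meanwhile
    });
});

// worker.js
process.on('message', function (msg) {
    var result = heavy_operation(msg.image); // the 5-second computation runs here
    process.send(result);
});

Keep in mind that send serializes the message, so for a large bitmap you would more realistically pass a file path or an ID and load the data inside the worker.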
UPDATE
how do i create a non-blocking asynchronous function in node.js? may also be worth reading. That question is actually referenced in the one I linked above, but I thought I'd include it here for simplicity.