node.js, tcp

Is it possible to encounter race condition when processing data with a Node.js TCP server?


I am setting up a TCP server using Node.js. On each 'data' event I pass the data to an instance of a parser object, which processes it and then pushes it to the cloud. My question: if the first 'data' event hasn't finished processing when the second 'data' event fires on the same TCP connection, what happens to the data? Will the second data packet be processed 'concurrently' with the first one?

This TCP server is meant to handle multiple connections, and each connection continuously carries a fairly large amount of data. It is important for the data to be processed sequentially.

var net = require('net');
const rfDataParser = require('./MyDataParser');
const HOST = 'localhost';
const PORT = 8000;

var server = net.createServer(onClientConnected);

function onClientConnected(sock) {
    const parser = new rfDataParser();
    sock.on('data', function(data) {
        parser.parse(data);
    });
}

server.listen(PORT, HOST, function() {  
    console.log('server listening on %j', server.address());
});

I expect the data of the second 'data' event not to be processed until the data of the first 'data' event has finished processing.


Solution

  • You're asking two different questions here:

    Will the second data packet be processed 'concurrently' with the first one?

    No. The event loop will queue the handling of the second event if the first one is still being handled. Node.js runs your JavaScript on a single thread; two lines of a JS file are never executed concurrently.
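    A minimal sketch of that guarantee, using a plain EventEmitter as a stand-in for a socket (the handler and the busy loop are illustrative, not your parser):

    ```javascript
    // If the handler is fully synchronous, 'data' callbacks can never
    // overlap: each one runs to completion before the event loop
    // dequeues the next event.
    const { EventEmitter } = require('events');

    const sock = new EventEmitter();
    const order = [];

    sock.on('data', function (chunk) {
        order.push('start:' + chunk);
        // Purely synchronous work -- nothing yields back to the event loop.
        for (let i = 0; i < 1e6; i++) {} // stand-in for parser.parse(chunk)
        order.push('end:' + chunk);
    });

    sock.emit('data', 'A');
    sock.emit('data', 'B');

    console.log(order.join(',')); // start:A,end:A,start:B,end:B
    ```

    Chunk A always finishes before chunk B starts, no matter how long the synchronous work takes.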

    Is it possible to encounter a race condition?

    Yes. While your code runs in a single thread, Node itself relies on multiple threads to handle I/O. If your processing includes asynchronous I/O and you do not wait for its callback (or promise) before treating the chunk as done, a race condition is possible.
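    A sketch of both the race and one common fix, chaining each chunk onto a per-connection promise so chunks are processed strictly in order (`parseAsync` and the delays are illustrative stand-ins for real async work such as an upload):

    ```javascript
    const events = [];

    // Stand-in for an async parse/upload step that takes `ms` milliseconds.
    function parseAsync(chunk, ms) {
        return new Promise(resolve => setTimeout(() => {
            events.push(chunk);
            resolve();
        }, ms));
    }

    // Racy: the handler fires off async work without awaiting it, so a
    // fast second chunk can finish before a slow first chunk.
    async function racy() {
        events.length = 0;
        parseAsync('slow', 30); // fire and forget
        parseAsync('fast', 5);
        await new Promise(r => setTimeout(r, 60)); // let both complete
        return events.join(',');
    }

    // Sequential: each chunk is chained onto the previous one, so the
    // second chunk cannot start until the first has finished.
    async function sequential() {
        events.length = 0;
        let queue = Promise.resolve();
        queue = queue.then(() => parseAsync('slow', 30));
        queue = queue.then(() => parseAsync('fast', 5));
        await queue;
        return events.join(',');
    }

    const demo = (async () => {
        console.log(await racy());       // fast,slow -- out of order
        console.log(await sequential()); // slow,fast -- arrival order
    })();
    ```

    In a real server you would keep one `queue` promise per connection (alongside the parser in `onClientConnected`) and chain each incoming chunk onto it; `sock.pause()`/`sock.resume()` is another option if the queue grows too large.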

    You can read more about the event loop in the Node.js documentation.