
parse-server and long-polling


I've got a client who has some custom hardware with fairly limited capabilities (in terms of both power and connectivity) and a home-built backend (IIS + SQL Server) solution. One of the main ways this setup handles asynchronous-style communication is through a variant of long-polling with reasonable timeouts.

I'm investigating the possibility of switching their backend solution out for parse-server, and I'm curious whether anybody has experience or insight into using parse-server with long-polling requests. In particular, is it possible to achieve long-polling using Cloud Code (my guess is that it is), and is it a viable route with regard to resources and performance?

I suspect it will be fine, judging by what others have written about Node.js's efficiency at holding connections open, but I'd love to hear from anybody with knowledge of the matter.


Solution

  • So I created a simple test to see how parse-server handles long polling. I created a Cloud Code endpoint which leaves the connection open for 30 seconds before returning:

    async function sleep(seconds) {
        const millisecondsToWait = seconds * 1000;
        return new Promise(resolve => setTimeout(resolve, millisecondsToWait));
    }

    Parse.Cloud.define('test', async (req) => {
        await sleep(30);
        return 'Hi';
    });
    

    And I wrote a bash script which fires a lot of concurrent requests at the endpoint in a loop:

    #!/bin/bash
    
    for i in {0..200}
    do
        curl -X POST -H 'X-Parse-Application-Id: myapp' -H 'X-Parse-REST-API-Key: someString' -H 'Content-Type: application/json' http://localhost:5050/parse/functions/test &
    done
    
    # wait for all background curl requests to finish
    wait
    

    The result suggests that parse-server doesn't have any issue with long-polling: all 201 concurrent requests were held open and completed, and resource usage of the node process didn't change significantly during the test.

    So in the unlikely event anybody finds themselves with this need in the future, you should be safe.
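
    In a real long-polling endpoint you'd usually want to return as soon as data arrives rather than sleeping for a fixed interval. A minimal sketch of that pattern, assuming a generic `pollUntil` helper and a hypothetical `Message` class (neither is part of the test above):

    ```javascript
    // Sketch: resolve as soon as `check` returns a truthy value,
    // or resolve with null once `timeoutMs` has elapsed.
    async function pollUntil(check, timeoutMs, intervalMs = 1000) {
        const deadline = Date.now() + timeoutMs;
        while (Date.now() < deadline) {
            const result = await check();
            if (result) return result;   // data arrived: return early
            await new Promise(resolve => setTimeout(resolve, intervalMs));
        }
        return null;                     // timed out with nothing new
    }

    // Hypothetical use inside a Cloud Code function ('Message' and the
    // 'since' parameter are assumptions, not from the original answer):
    // Parse.Cloud.define('poll', async (req) => {
    //     return pollUntil(async () => {
    //         const query = new Parse.Query('Message');
    //         query.greaterThan('createdAt', new Date(req.params.since));
    //         return query.first({ useMasterKey: true });
    //     }, 30 * 1000);
    // });
    ```

    The helper keeps each request's cost down to one pending timer at a time, which matches the observation above that open long-poll connections are cheap for Node.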