javascript, node.js, salesforce-lightning

Parsing Salesforce Bulk API data into chunks to process


I have the code below, where I am getting results from a Bulk API query job that returns a huge number of records (60,000 to 80,000). I am converting the result to JSON and will be inserting it into a database. Can anyone suggest the best way to handle this amount of data and how to process it in chunks?

const request = require( 'request' );
const csvJson = require( 'csvtojson' );

// 'options' holds the Bulk API results URL and auth headers
request.get( options, function ( error, response, body ) {
    if ( error )
    {
        console.error( error ); // handle the request failure
    } else
    {
        // Converts the entire CSV body to a JSON array in memory
        csvJson()
            .fromString( response.body )
            .then( ( jsonObj ) => {
                var a = JSON.stringify( jsonObj );
                // insert into the database here
            } );
    }
} );

Solution

async function* readApi() {
  let page = 1;
  while (page != null) {
    const r = await fetch(`http://target-api.com/stuff?page=${page}`);
    const d = await r.json();
    // Yield the current page's data and wait for the caller to
    // pass in the next page number (or null to stop)
    page = yield d;
  }
}

const it = readApi();

// Each next() call returns a promise, so await it (inside an async function)
await it.next();     // runs the generator up to the first yield and returns the first page
await it.next(2);    // passes 2 back in, fetches and returns the second page
// ..
// ..
await it.next(null); // when you are done, pass null to end the loop and finish the generator
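
As a closer fit for the original csvtojson flow, the CSV body can also be streamed row by row and buffered into fixed-size batches, so the 60,000 to 80,000 records never have to be stringified as one object. The sketch below assumes csvtojson v2 (whose subscribe callback pauses parsing while a returned promise is pending), an arbitrary batch size of 2,000, and a hypothetical insertBatch(rows) helper standing in for your database write.

const request = require( 'request' );
const csvJson = require( 'csvtojson' );

const BATCH_SIZE = 2000; // assumed batch size, tune for your database
let batch = [];

csvJson()
    .fromStream( request.get( options ) )   // stream the response instead of buffering it
    .subscribe( async ( row ) => {
        batch.push( row );
        if ( batch.length >= BATCH_SIZE )
        {
            const rows = batch;
            batch = [];
            await insertBatch( rows );      // hypothetical database insert
        }
    }, ( err ) => {
        console.error( err );               // parsing or request error
    }, async () => {
        if ( batch.length > 0 )
        {
            await insertBatch( batch );     // flush the final partial batch
        }
    } );

This keeps memory bounded to roughly one batch at a time, and because the subscribe callback returns a promise, the parser holds back new rows until the current batch insert has finished.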