Tags: javascript, asynchronous, fetch-api

Fetch API await chunk of defined chunk size


I want to fetch a URL and process the response in chunks of a defined size. I also don't want to block while waiting for the whole chunk to be available. Is something like this available in the Fetch API?

An example of how it could look:

const response = await fetch(url)
const reader = response.body.getReader()
const chunk = await reader.read(CHUNK_SIZE) 


Solution

  • There is support for consuming a fetch() response as a stream. See the MDN reference here. It appears that you need some boilerplate code for ReadableStream...

    The code would look like this:

    const workOnChunk = (chunk) => { console.log("do-work")};
    
    // Fetch your stuff  
    fetch(url)
    // Retrieve its body as ReadableStream
    .then(response => response.body)
    
    // Boilerplate for the stream - refactor it out in a common utility.
    .then(rs => {
      const reader = rs.getReader();
    
      return new ReadableStream({
        async start(controller) {
          while (true) {
            const { done, value } = await reader.read();
    
            // When no more data needs to be consumed, break the reading
            if (done) {
              break;
            }
    
            // Do your work: check out what `value` contains
            workOnChunk(value);
    
            // Optionally append the value if you need the full blob later.
            controller.enqueue(value);
          }
    
          // Close the stream
          controller.close();
          reader.releaseLock();
        }
      })
    })
    // Create a new response out of the stream (can be avoided?)
    .then(rs => new Response(rs))
    // Create an object URL for the response
    .then(response => response.blob())
    .then(blob => { console.log("Do something with the full blob") })
    .catch(console.error)
    
    

    NOTE: The node-fetch API is not exactly the same. If you are on Node.js, see node-fetch's stream support.
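
  • The default reader's read() does not take a size argument; it resolves with whatever bytes have arrived so far. If you specifically need chunks of a defined size, a minimal sketch (not part of the answer above; readInChunks, CHUNK_SIZE and workOnChunk are placeholder names) is to buffer the incoming Uint8Array values yourself and emit CHUNK_SIZE-byte slices:

    // Sketch: buffer stream output and hand out fixed-size chunks.
    async function readInChunks(url, CHUNK_SIZE, workOnChunk) {
      const response = await fetch(url);
      const reader = response.body.getReader();
      let buffer = new Uint8Array(0);

      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        // Append the newly received bytes to the pending buffer.
        const merged = new Uint8Array(buffer.length + value.length);
        merged.set(buffer);
        merged.set(value, buffer.length);
        buffer = merged;

        // Emit full-sized chunks; keep the remainder for the next read.
        while (buffer.length >= CHUNK_SIZE) {
          workOnChunk(buffer.slice(0, CHUNK_SIZE));
          buffer = buffer.slice(CHUNK_SIZE);
        }
      }

      // Flush whatever is left over (smaller than CHUNK_SIZE).
      if (buffer.length > 0) {
        workOnChunk(buffer);
      }
    }

    Usage would look like readInChunks(url, 64 * 1024, chunk => console.log(chunk.length)); the reading itself stays non-blocking because each read() is awaited.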