
Under high concurrency, frequent timeouts occur when using fetch to pull resources in Node.js


I have written a program like the following:

// Node.js version: v20.16.0

// Fetch a URL, aborting the request if it takes longer than `timeout` ms.
async function fetchFile(url: string, timeout: number): Promise<Response> {
    const controller = new AbortController()
    const id = setTimeout(() => controller.abort('timeout: ' + url), timeout)
    return fetch(url, {signal: controller.signal})
        .finally(() => clearTimeout(id))
}

const testUrl = 'https://lf3-cdn-tos.bytecdntp.com/cdn/expire-1-M/KaTeX/0.15.2/contrib/copy-tex.min.js'
const count = 100
const timeout = 5000

async function test() {
    // Each iteration fires `count` concurrent requests and waits for all of them.
    for (let i = 0; i < count; i++) {
        try {
            await Promise.all(
                new Array(count).fill(testUrl).map(it => fetchFile(it, timeout))
            )
            console.log('success: ' + i)
        } catch (e: any) {
            console.error(`error[${i}]: ${e}`)
        }
    }
}

test()

When I run this code, timeouts occur frequently, as shown in the console output below.

[image: console output]

The probability of a timeout increases as concurrency rises, and I encountered similar issues when using node-fetch.

In this code the same resource is fetched repeatedly; in my actual project I fetch different resources concurrently, but I wrote it this way for testing convenience. In actual testing, the resources that time out seem to be fixed: it is always the same few links. Yet when I fetch those links individually, each completes successfully within 200 milliseconds (the timeout is set to 5 seconds). The timeouts only occur when many links are fetched simultaneously.

I want to know what causes this issue, and whether there is a way to prevent these concurrency-induced timeouts while keeping concurrency as high as possible.


Solution

  • I came across similar issues in our document management system when fetching files through our downloader utility. You can try these approaches:

    1. Limit the number of concurrent requests.

    2. Implement retries with exponential backoff for failed requests (see the sketch after this list).

    3. Consider using different libraries or tools that handle concurrency more effectively.

    4. Monitor and adjust timeout settings and be aware of server limits.
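
    For point 2, a minimal retry-with-exponential-backoff sketch is shown below. It reuses the fetchFile helper from the question; the attempt count and base delay are illustrative assumptions, not prescriptive values:

    async function fetchWithRetry(url: string, timeout: number, retries = 3): Promise<Response> {
        for (let attempt = 0; ; attempt++) {
            try {
                return await fetchFile(url, timeout)
            } catch (e) {
                // Give up once the retry budget is exhausted.
                if (attempt >= retries) throw e
                // Back off exponentially: wait 1s, 2s, 4s, ... before retrying.
                await new Promise(resolve => setTimeout(resolve, 2 ** attempt * 1000))
            }
        }
    }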

    The npm p-limit library lets you put a concurrency limit on promises; it is useful when you need to cap the number of concurrent asynchronous operations. For example:
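
    Here is a sketch of a limiter wrapped around the question's fetchFile (the limit of 10 concurrent requests is an assumption; tune it for your server and network):

    import pLimit from 'p-limit'

    // Allow at most 10 requests in flight at any one time.
    const limit = pLimit(10)

    async function fetchAllLimited(urls: string[], timeout: number): Promise<Response[]> {
        // All the promises are created up front, but p-limit only starts
        // 10 of them at a time, so the full burst never hits the server at once.
        return Promise.all(
            urls.map(url => limit(() => fetchFile(url, timeout)))
        )
    }

    With this in place, the test loop can call fetchAllLimited(new Array(count).fill(testUrl), timeout) instead of mapping fetchFile over the array directly.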