I'm writing a program to resolve about 10,000 CNAMEs.
The problem I have is that dns.resolveCname() fails with Error: queryCname ESERVFAIL when the number of CNAMEs to be resolved gets too large.
The code looks like the following:
const dns = require('dns')
dns.setServers(['8.8.8.8']) // Google's public DNS server

let cnames = [....] // length of cnames is 10,000
let promiseArr = []

for (let i = 0; i < cnames.length; i += 1) {
  let p = new Promise((resolve, reject) => {
    dns.resolveCname(cnames[i], (err, records) => {
      if (err) {
        console.log(err) // this line prints Error: queryCname ESERVFAIL
        resolve()
      } else {
        console.log(records)
        resolve()
      }
    })
  })
  promiseArr.push(p)
}
Promise.all(promiseArr)
  .then(value => {
    console.log(`Promise.all done`)
  })
  .catch(err => {
    console.log(`promise err: ${err}`)
  })
Does this mean I can't call dns.resolveCname() too frequently? Can I avoid the problem by reducing the rate at which I trigger dns.resolveCname(), or is there some other way to overcome it? I'm using Node.js v6.2.2.
ESERVFAIL means the application queried for api.example.com, the DNS server answered with SERVFAIL, and the call failed.
Promise.all rejects (and never reaches its .then handler) as soon as at least one of the promises rejects, so a single failing lookup can cost you all the other results. Besides that, firing 10,000 async calls at once is bad practice anyway; you can easily run out of memory.
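For illustration, here is a minimal snippet (not your DNS code) showing that behaviour:

// Promise.all rejects as soon as any promise in the array rejects,
// even though the remaining promises keep running on their own.
const ok = new Promise(resolve => setTimeout(() => resolve('ok'), 100))
const bad = Promise.reject(new Error('SERVFAIL'))

Promise.all([ok, bad])
  .then(values => console.log(values))                  // never reached
  .catch(err => console.log(`caught: ${err.message}`))  // caught: SERVFAIL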
One way to solve your problem is to implement a concurrency-limited queue that collects both the errors and the results, so you can then output everything in an orderly way (see the sketch just below).
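If you'd rather not add a dependency, a minimal hand-rolled sketch of the same idea could look like this (the limit of 20 and the resolveAll helper are placeholders I made up, not anything from your code):

const dns = require('dns')

// Run the lookups with at most `limit` requests in flight at a time.
// Errors are collected instead of rejecting, so one SERVFAIL does not
// throw away the other results.
function resolveAll(cnames, limit) {
  return new Promise(resolve => {
    const results = new Array(cnames.length)
    let next = 0
    let done = 0

    if (cnames.length === 0) return resolve(results)

    function startOne() {
      if (next >= cnames.length) return
      const i = next++
      dns.resolveCname(cnames[i], (err, records) => {
        results[i] = err ? { error: err } : { records }
        done += 1
        if (done === cnames.length) return resolve(results)
        startOne() // a slot freed up, start the next lookup
      })
    }

    for (let k = 0; k < Math.min(limit, cnames.length); k += 1) startOne()
  })
}

// Usage: 20 concurrent lookups instead of 10,000 at once.
// resolveAll(cnames, 20).then(results => console.log(results.length))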
One library I found that implements such a limited queue is cwait. From its documentation:
import * as Promise from 'bluebird';
import {TaskQueue} from 'cwait';

/** Queue allowing 3 concurrent function calls. */
var queue = new TaskQueue(Promise, 3);

Promise.map(list, download); // Download all listed files simultaneously.
Promise.map(list, queue.wrap(download)); // Download 3 files at a time.
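Applied to your CNAME case it could look roughly like this; a sketch that assumes cwait and bluebird are installed and picks an arbitrary concurrency of 10:

const Promise = require('bluebird')
const dns = require('dns')
const TaskQueue = require('cwait').TaskQueue

dns.setServers(['8.8.8.8'])

// Turn the callback API into a promise-returning one (bluebird's promisify).
const resolveCname = Promise.promisify(dns.resolveCname)

const cnames = [/* ... your 10,000 names ... */]
const queue = new TaskQueue(Promise, 10) // at most 10 lookups in flight

Promise.map(cnames, queue.wrap(name =>
  resolveCname(name)
    .then(records => ({ name: name, records: records }))
    .catch(err => ({ name: name, error: err })) // keep failures instead of rejecting everything
)).then(results => {
  console.log('all lookups finished', results.length)
})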