I want to insert large amounts of data into SQL Server via knex. I've lowered both the number of rows per insert and the insert concurrency, but I still end up with high CPU usage. Could it have something to do with knex starting a new connection for each insert? Any input would be appreciated, thanks. This is the code I use to connect to the db:
client: "mssql",
connection: {
  server: productionDbIp,
  user: productionDbUsername,
  password: productionDbPassword,
  database: productionDbName,
  options: {
    port: productionDbPort
  }
},
// note: knex expects `pool` at the top level of the config,
// not nested inside `connection`
pool: { min: 0, max: 7 }
Here is the code used to queue the inserts:
const PQueue = require('p-queue'); // p-queue v5 and earlier; newer versions are ESM-only
const insertQue = new PQueue({ concurrency: 1 });

// formatInserts returns an array of objects that are ready to be inserted
const [inserts] = await formatInserts(distinctData);
const insertPromises = [];
while (inserts.length > 0) {
  // take the next batch of 10 rows
  const insert = inserts.splice(0, 10);
  // insertQue.add does not throw synchronously; any rejection
  // surfaces at the Promise.all below
  insertPromises.push(
    insertQue.add(() => insertToDb(insert, 'tableName'))
  );
}
try {
  await Promise.all(insertPromises);
} catch (error) {
  console.log(error);
}
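For comparison, here is a sketch of how the batching above could be done without hand-rolled splicing. The `chunk` helper is hypothetical (not from the question), and the commented `knex.batchInsert` call assumes a knex version that provides it; `knex` and the row array are assumed to exist as in the question.

```javascript
// Pure helper: split an array into chunks of `size` without mutating it.
// A non-mutating alternative to the splice-based loop above.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// knex also ships a batching helper that issues one multi-row INSERT
// per chunk over a pooled connection, e.g. (assumed usage):
// await knex.batchInsert('tableName', rows, 100);
```

Each chunk then maps to a single multi-row INSERT statement, so the pool is not asked for a fresh connection per row.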
async function insertToDb(data, table, numTry = 0) {
  try {
    return await knex(table).insert(data);
  } catch (error) {
    if (numTry > 3) {
      console.log(error);
      throw { message: `Error inserting data`, data };
    }
    // retry against the same table (the table argument must be passed through)
    return insertToDb(data, table, numTry + 1);
  }
}
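One thing worth noting: the retry above fires again immediately, so a struggling server gets hit in a tight loop. A small delay between attempts can help; this is only a sketch, and the base delay and cap values are arbitrary assumptions:

```javascript
// Exponential backoff helper for the retry path above.
// base = 250 ms and cap = 4000 ms are arbitrary choices, not tuned values.
function backoffMs(numTry, base = 250, cap = 4000) {
  return Math.min(cap, base * 2 ** numTry);
}

// Promise-based sleep, so the retry can simply `await` it.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// In insertToDb's catch block, before the recursive call:
// await sleep(backoffMs(numTry));
```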
A quick Google search shows this answer: https://apple.stackexchange.com/q/240204
So seeing 300% CPU might not even be an issue: "Your virtual cores are gearing up to do their job. Since you are using async calls, multiple processors can work together."