I have a problem with the following query in the MongoDB shell ONLY when the size of the array gets bigger, for example, more than 100 elements.
newPointArray is an array with 500 elements:
newPointArray.forEach(function (newDoc) {
  // Update the properties field for each doc
  db.getCollection('me_all_test')
    .update({ '_id': newDoc._id },
            { $set: { "properties": newDoc.properties } },
            { upsert: true });
});
Can someone guide me on how to run this query in the MongoDB shell for a larger array, using an async loop, a promise, or something similar?
Thanks in advance.
Rather than doing individual .update() calls, use a .bulkWrite() operation. This should reduce the overhead of asking MongoDB to perform many individual operations. This assumes you are doing general write operations; I'm not clear on whether newPointArray always contains new points that don't exist yet.
Given your example, I believe your script would mimic the following:
// I'm assuming this is your array (but truncated)
let newPointArray = [
  {
    _id: "1",
    properties: {
      foo: "bar"
    }
  },
  {
    _id: "2",
    properties: {
      foo: "buzz"
    }
  }
  // Whatever other points you have in your array
];
db
  .getCollection("me_all_test")
  .bulkWrite(newPointArray
    // Map your array to a query bulkWrite understands
    .map(point => {
      return {
        updateOne: {
          filter: {
            _id: point._id
          },
          update: {
            $set: {
              properties: point.properties
            }
          },
          upsert: true
        }
      };
    }));
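If you want to verify what the bulk operation actually did, bulkWrite returns a result object with counters you can inspect. A minimal sketch, assuming the same mapped array as above (the ops and result names are just for illustration):
// Build the same list of updateOne operations as in the example above
let ops = newPointArray.map(point => ({
  updateOne: {
    filter: { _id: point._id },
    update: { $set: { properties: point.properties } },
    upsert: true
  }
}));
// Capture the result so you can check how the documents were handled
let result = db.getCollection("me_all_test").bulkWrite(ops);
printjson({
  matched: result.matchedCount,
  modified: result.modifiedCount,
  upserted: result.upsertedCount
});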
You may also want to consider setting ordered to false in the operation, which may yield further performance gains since MongoDB is then free to execute the writes in parallel. That would look something like this:
db
  .getCollection("me_all_test")
  .bulkWrite([SOME_ARRAY_SIMILAR_TO_ABOVE_EXAMPLE], {
    ordered: false
  });
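For completeness, here is a sketch of the two ideas combined, again assuming newPointArray is shaped like the truncated example above:
// The mapped upserts from the first example, run as a single unordered bulk operation
db
  .getCollection("me_all_test")
  .bulkWrite(
    newPointArray.map(point => ({
      updateOne: {
        filter: { _id: point._id },
        update: { $set: { properties: point.properties } },
        upsert: true
      }
    })),
    { ordered: false }
  );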