Is there a way to have a list property that can be empty and whose items are unique (throughout the whole collection, not just within the list itself)?
Minimal example of the code I currently have:
const innerItemSchema = new Schema({
foo: { type: "string", required: true, unique: true, },
});
const outerItemDefinition = {
list: { type: [innerItemSchema], required: true },
} as const;
export type RawOuterItem = InferRawDocType<typeof outerItemDefinition> & { _id?: Types.ObjectId };
const outerItemSchema = new Schema(outerItemDefinition);
export const OuterItem = model("OuterItem", outerItemSchema);
Currently, saving two documents with empty lists throws an error, because the missing list.foo counts as duplicate foo values throughout the collection:
MongoServerError: E11000 duplicate key error collection: db.outerItems index: list.foo_1 dup key: { list.foo: null }
(Wrong behavior)
No error is thrown when a foo value exists more than once within the same list (Wrong behavior).
Here is a list of examples of the expected and current behavior:
[{ list: [{foo: "a"}, {foo: "a"}] }]
// Should throw because of duplicate foo but doesn't
[{ list: [{foo: "a"}] }, { list: [{foo: "a"}] }]
// Throws because of duplicate foo
[{ list: [{foo: "a"}, {foo: "b"}] }]
// Ok
[{ list: [{foo: "a"}] }, { list: [{foo: "b"}] }]
// Ok
[{ list: [] }]
// Ok
[{ list: [] }, { list: [] }]
// Should be ok but throws because of duplicate foo
Is there a way to change the schema definition to allow for duplicate lists?
I would rather not remove the mongoose duplicate check and then have to manually check for duplicates before each insertion or update.
First of all, drop the existing index. You can use Compass or mongosh for this:
db.outerItems.dropIndex("list.foo_1");
Secondly, remove the unique: true constraint from the innerItemSchema:
const innerItemSchema = new Schema({
foo: { type: "string", required: true },
});
Lastly, make use of the Schema.index() method to add the index back, but this time with a partialFilterExpression so that the unique constraint only applies to documents where list.foo actually exists:
const outerItemSchema = new Schema(outerItemDefinition);
outerItemSchema.index(
{ "list.foo": 1 },
{
unique: true,
partialFilterExpression: {
"list.foo": {
$exists: true
}
},
}
);
export const OuterItem = model("OuterItem", outerItemSchema);
If you clear out the collection and restart your application server, you should now be able to save multiple documents with an empty array for the list field.
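For reference, here is a quick sketch of the behavior you should then see, assuming the OuterItem model defined above (calling Model.syncIndexes() is optional and simply rebuilds the indexes without restarting the server):

// Optionally rebuild the indexes in place instead of restarting the application server
await OuterItem.syncIndexes();

// Both inserts succeed now: list.foo does not exist on these documents,
// so the partial index skips them entirely
await OuterItem.create({ list: [] });
await OuterItem.create({ list: [] });

// Succeeds once; a second document reusing foo: "a" is still rejected
// with an E11000 duplicate key error
await OuterItem.create({ list: [{ foo: "a" }] });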
Edit:
To answer your comment: no, the index can't be modified to accommodate "sibling" elements in the list array (a unique multikey index only enforces uniqueness across documents, not within a single document's array).
However, with mongoose you can simply add a Custom Validator to the list field. Before a document is saved to the database, mongoose will run the validator against the array to ensure that no two objects in the list array share the same value for foo:
const outerItemDefinition = {
  list: {
    type: [innerItemSchema],
    required: true,
    validate: [function(list) {
      // Fail if any foo value appears more than once within this list
      for (const item of list) {
        const duplicates = list.filter((other) => other.foo === item.foo);
        if (duplicates.length > 1) {
          return false;
        }
      }
      return true;
    }, 'Each list.foo must be unique.']
  },
}
This validator, in combination with the new index, should cover both bases. You can refactor the validator to make it more efficient, but I think it explains the looping to check for any duplicates pretty well.
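As an example of such a refactor, here is a sketch of a single-pass version that uses a Set instead of filtering the array for every element (same result, just less looping):

validate: [function(list) {
  // If any foo value repeats, the Set ends up smaller than the array
  const uniqueFoos = new Set(list.map((item) => item.foo));
  return uniqueFoos.size === list.length;
}, 'Each list.foo must be unique.']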
Please note that when doing an update such as findOneAndUpdate or updateMany you will need to pass in the { runValidators: true } option to make sure the validator is run as part of the update. This is something that is typically overlooked when using mongoose. See Update Validators in the official docs for more explanation.
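For example, a sketch of an update that replaces the whole list (someId is just a placeholder for the document you are targeting):

await OuterItem.findOneAndUpdate(
  { _id: someId },
  { $set: { list: [{ foo: "a" }, { foo: "a" }] } },
  { runValidators: true }
);
// Rejected by the custom validator: "Each list.foo must be unique."

Note that update validators only run on the paths that appear in the update, which is why the example $sets the list path as a whole.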