I have a large MongoDB collection. When I run {"$and":[{"fieldCheck":"1"},{"timestamp":{"$lt": MAX_SAFE_INTEGER}}]}, the query returns its results fast, but when I sort by {timestamp: -1, _id: -1}
with the same filter, the query takes forever.
The collection has a compound index {timestamp: 1, _id: 1}, and fieldCheck is also indexed together with _id as {fieldCheck: 1, _id: 1}.
What is going wrong here? I have already created the indexes mentioned above.
The MongoDB query executor will normally use only a single index for a query. The predicate {"timestamp":{"$lt": MAX_SAFE_INTEGER}}
is essentially a test that timestamp is a number, so it is not selective at all.
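Assuming MAX_SAFE_INTEGER here is JavaScript's Number.MAX_SAFE_INTEGER (2^53 - 1), any realistic millisecond timestamp is far below that bound, so virtually every document with a numeric timestamp matches:

```
Number.MAX_SAFE_INTEGER               // 9007199254740991
// a current millisecond timestamp is around 1.7e12, far below that bound
Date.now() < Number.MAX_SAFE_INTEGER  // true
```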
When there is no sort, the query executor can use the index on {fieldCheck: 1, _id: 1}
to select only the documents that match {"fieldCheck":"1"}, and then fetch each of those to test the timestamp. This allows it to fetch only documents that actually match the query.
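You can confirm this by looking at the winning plan with explain(); roughly (collection name is a placeholder):

```
// expect an IXSCAN on { fieldCheck: 1, _id: 1 } followed by a FETCH
// stage that re-checks the timestamp predicate
db.myCollection.find({
  fieldCheck: "1",
  timestamp: { $lt: Number.MAX_SAFE_INTEGER }
}).explain("executionStats")
```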
When there is a sort stage, the query planner has two choices:

- Use the {fieldCheck: 1, _id: 1} index to select the matching documents, and then sort them in memory (an in-memory sort is a heavy operation and usually slow).
- Use the {timestamp: 1, _id: 1} index, scanned in reverse, to read documents in the requested {timestamp: -1, _id: -1} order, and fetch each one to test fieldCheck. While this avoids an in-memory sort, it requires reading nearly the entire collection.

Neither of these options is particularly performant.
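Running explain() on the sorted query shows which of the two plans the planner picked, roughly:

```
// look for either a SORT stage (in-memory sort) or a reverse IXSCAN on
// { timestamp: 1, _id: 1 } with a very large totalDocsExamined
db.myCollection.find({
  fieldCheck: "1",
  timestamp: { $lt: Number.MAX_SAFE_INTEGER }
}).sort({ timestamp: -1, _id: -1 }).explain("executionStats")
```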
If you create an index on {"fieldCheck":1, timestamp: -1, _id: -1}
, the query planner will be able to use this one index for both effective document selection and sorting, which should be much faster.
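A minimal sketch of creating that index and re-running the query (collection name is a placeholder):

```
// compound index: the equality field first, then the sort keys
db.myCollection.createIndex({ fieldCheck: 1, timestamp: -1, _id: -1 })

// the planner can now walk this one index to both filter on fieldCheck and
// return documents already in { timestamp: -1, _id: -1 } order
db.myCollection.find({
  fieldCheck: "1",
  timestamp: { $lt: Number.MAX_SAFE_INTEGER }
}).sort({ timestamp: -1, _id: -1 })
```

Because the equality field comes first and the sort keys follow, a single contiguous range of the index satisfies both the filter and the sort, with no in-memory SORT stage.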