Tags: mongodb, amazon-web-services, amazon-ec2, large-data

MongoDB can't access documents above a specific skip


I have a MongoDB instance running in Docker on an AWS EC2 t2.micro (30 GB storage, 1 GB RAM). The database holds a single collection of 411 thousand documents, which takes about 700 MB of disk space.

[screenshot from Robomongo omitted]

On my local computer, if I run this in mongo shell:

db.my_collection.find().skip(200000).limit(1)

then I get the correct result, but if I run this:

db.my_collection.find().skip(220000).limit(1)

then MongoDB shuts down. Why? What should I do to access this data?


Solution

It appears that your system doesn't have enough RAM to meet MongoDB's demand. When a Linux system runs critically low on memory, the kernel starts killing processes (the OOM killer) to keep the system itself from crashing.
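
You can usually confirm an OOM kill from the kernel log. A minimal sketch of what to look for, assuming a typical Ubuntu-based EC2 host (log locations vary by distribution):

# Look for OOM-killer activity in the kernel ring buffer
dmesg | grep -iE "out of memory|oom-killer"

# On Ubuntu/Debian the same messages also land in syslog
grep -i "killed process" /var/log/syslog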

I believe this is what's happening in your case too: MongoDB isn't even getting a chance to write a log entry. I'd recommend increasing the RAM or, if that's not feasible, adding more swap space, as sketched below. That will prevent the system crash, though MongoDB will keep running very slowly.
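
A minimal sketch of setting up a swap file, assuming a 2 GB size (the size is an assumption; pick what fits your 30 GB disk):

# Reserve a 2 GB file and turn it into active swap
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile    # swapon refuses world-readable swap files
sudo mkswap /swapfile
sudo swapon /swapfile

To keep the swap file across reboots, add the line /swapfile none swap sw 0 0 to /etc/fstab.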

Please visit these excellent resources on Linux and its behavior:

    https://unix.stackexchange.com/questions/136291/will-linux-start-killing-my-processes-without-asking-me-if-memory-gets-short

    https://serverfault.com/questions/480266/how-to-know-if-the-server-runs-out-of-ram-before-crashing-down
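
As a side note, large skip() values contribute to the memory pressure here, because the server must scan and discard every skipped document on each query. A common workaround, sketched below against the my_collection from the question (this is a general technique, not part of the fix above), is keyset pagination: seek on the always-indexed _id field instead of skipping:

// First page, ordered by _id
var page = db.my_collection.find().sort({ _id: 1 }).limit(100).toArray();
var lastId = page[page.length - 1]._id;

// Next page: seek past the last _id seen instead of skipping rows
db.my_collection.find({ _id: { $gt: lastId } }).sort({ _id: 1 }).limit(100);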