I am facing out-of-memory errors during file upload test execution. I am running the test from an EC2 m4.xlarge instance (16 GB RAM) and have allocated 80% of the memory as the JMeter heap size. During the test, CPU utilization hits 100%, the whole memory is consumed (around 12 GB), and a huge java_pid***.hprof (heap dump) file is created in the bin folder.
File upload size: mix of 200 KB, 400 MB, and 1.5 GB files
Number of threads: 50
JMeter version: 3.3
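For context, the heap was raised by editing the HEAP variable in JMeter's bin/jmeter startup script (the exact line varies by JMeter version; the values below are illustrative, not copied from my setup):

```shell
# bin/jmeter (JMeter 3.x) -- illustrative: ~80% of the instance's 16 GB as heap
HEAP="-Xms12g -Xmx12g"
```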
I have tried the fixes suggested on different forums, but they didn't work.
Has anyone faced this and how did you fix this?
Also, how do I disable the huge (3–5 GB) java_pid***.hprof dump file from being generated?
50 threads * 1.5 GB == 75 GB
while you have only around 12 GB allocated to JMeter, so it is definitely not enough.
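Spelling out the arithmetic above (worst case assumes every thread buffers its largest file fully in heap at the same time):

```shell
#!/bin/sh
# Worst case: each of the 50 threads holds the largest file (1.5 GB = 1536 MB) in memory at once.
threads=50
file_mb=1536
total_mb=$((threads * file_mb))
echo "${total_mb} MB = $((total_mb / 1024)) GB"   # prints "76800 MB = 75 GB"
```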
You would need something like an m4.10xlarge with 160 GB RAM or an m5d.12xlarge with 192 GB RAM in order to upload files that large with that many threads.
Another option is switching to Distributed Testing, but you will need to spin up several more m4.xlarge instances.
You can also try switching to the HTTP Raw Request sampler, which has a nice feature of streaming the file directly to the server without pre-loading it into memory, so theoretically you should be able to simulate file uploads even on that limited instance; however, it might not fully reflect a real-life scenario. You can install the HTTP Raw Request sampler using the JMeter Plugins Manager.
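If you prefer the command line, the Plugins Manager also ships a CLI that can install plugins by id. A rough sketch (the plugin id jpgc-httpraw is my assumption here; verify the exact id on jmeter-plugins.org before running):

```shell
# From the JMeter bin folder, after the Plugins Manager jar is in lib/ext
# (plugin id below is assumed -- check jmeter-plugins.org)
./PluginsManagerCMD.sh install jpgc-httpraw
```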
To disable heap dump creation, remove the DUMP="-XX:+HeapDumpOnOutOfMemoryError" line from the JMeter startup script.
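For example, in bin/jmeter (or the corresponding DUMP line in bin/jmeter.bat on Windows) the change would look roughly like this:

```shell
# bin/jmeter -- comment out (or delete) the heap-dump option
# so no java_pid*.hprof file is written when an OutOfMemoryError occurs
# DUMP="-XX:+HeapDumpOnOutOfMemoryError"
```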