I want to measure peak memory usage in R so that I can allocate resources appropriately. The method must account for intermediate objects created during the analysis. For example, mx below is an 80 MB object created in each iteration of lapply but never assigned to a global variable, so peak memory usage should come out at least 80 MB above baseline.
gc(reset = TRUE)
sum(gc()[, "(Mb)"])  # 172 Mb baseline
lapply(1:3, function(x) {
  mx <- rnorm(1e7)   # 80 MB object
  mean(mx)
})
sum(gc()[, "(Mb)"])  # still 172 Mb!
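For what it's worth, base R does keep a high-water mark: gc(reset = TRUE) resets the "max used" statistics, and gc()'s output reports them in its sixth column, so the transient allocation can show up there even though current usage falls back to baseline. A minimal sketch of that approach:

```r
gc(reset = TRUE)             # reset the "max used" statistics
baseline <- sum(gc()[, 6])   # column 6 is "max used" in Mb

lapply(1:3, function(x) {
  mx <- rnorm(1e7)           # ~80 MB temporary, never kept
  mean(mx)
})

peak <- sum(gc()[, 6])       # high-water mark since the reset
peak - baseline              # reflects the transient allocations
```

Note this only registers allocations the garbage collector has seen, so it is coarser than a dedicated profiler.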
I found what I was looking for in the peakRAM
package. From the documentation:
This package makes it easy to monitor the total and peak RAM used so that developers can quickly identify and eliminate RAM hungry code.
mem <- peakRAM({
  for (i in 1:5) {
    mean(rnorm(1e7))
  }
})
mem$Peak_RAM_Used_MiB  # peak RAM in MiB; at least ~76 MiB here, the size of one rnorm(1e7) temporary
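peakRAM can also take several expressions at once and, per the package documentation, returns a data frame with one row per expression, which makes it easy to compare the memory cost of alternatives side by side. A sketch (column names as documented by the package):

```r
library(peakRAM)

# One row per expression: compare two computations' memory footprints.
mem <- peakRAM(
  mean(rnorm(1e7)),  # allocates a 1e7-double temporary (~76 MiB)
  mean(rnorm(1e5))   # a far smaller temporary
)

mem[, c("Function_Call", "Peak_RAM_Used_MiB")]
```

The first row's peak should dwarf the second's, pinpointing which expression is the RAM-hungry one.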