I'm trying to work with large datasets and files (e.g., GeoTIFF) in plain R (via the terminal) and in RStudio (1.4.1106), but both crash every time on Linux (Manjaro, x64, Core i7, 8 GB RAM) with some scripts, especially when raster data is plotted with ggplot2 to produce a high-quality map, and also when fitting an lmer model with random factors on a CSV file of ~3000 rows and 6 columns. The problem is probably memory management, since all available memory gets consumed. To work around it, I tried two packages that limit or increase the memory available to R, "ulimit" and "RAppArmor". However, if the memory is limited, all available RAM is exhausted and the famous "cannot allocate a vector..." message appears; if the limit is raised too high, R/RStudio simply crashes. On Windows, the following code works like a charm to increase the memory limit (it is only needed when plotting a raster with ggplot2):
if(.Platform$OS.type == "windows") withAutoprint({
memory.size()
memory.size(TRUE)
memory.limit()
})
memory.limit(size=56000)
However, memory.limit() does not work on Linux. As stated above, I used the two functions below to manage RAM on Manjaro:
library(ulimit)
memory_limit(10000)
Or
library(RAppArmor)
rlimit_as(1e10)
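For reference, the same cap can also be set at the shell level before launching R: the `ulimit -v` builtin limits the same address-space resource (RLIMIT_AS) that `rlimit_as()` sets from inside R. A minimal sketch, using the 10 GB value from the call above (note `ulimit -v` takes KiB):

```shell
# Limit the virtual address space for this subshell; any R process started
# inside it would inherit the cap. 10 GiB = 10 * 1024 * 1024 KiB.
(
  ulimit -v $((10 * 1024 * 1024))
  echo "limit set to $(ulimit -v) KiB"   # R launched here inherits this limit
)
```

Either way, the limit only changes when the allocation fails; it does not reduce how much memory the script actually needs.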
Please find below a reproducible example similar to my real case, including the raster properties. The first six (commented) lines are the ones used on Windows to increase the memory limit:
#if(.Platform$OS.type == "windows") withAutoprint({
# memory.size()
# memory.size(TRUE)
# memory.limit()
#})
#memory.limit(size=56000)
library(rgdal)
library(raster)
library(tidyverse)
library(sf)
library(rnaturalearth)
library(rnaturalearthdata)
library(viridis)
library(ggspatial)
# Build a raster with the same dimensions as my GeoTIFF (~47.7 million cells)
test <- raster(nrows = 8280, ncols = 5760, xmn = -82, xmx = -34, ymn = -57, ymx = 12)
vals <- 1:ncell(test)
test <- setValues(test, vals)
test
names(test)
# Convert to points and then to a data frame for ggplot2 (one row per cell)
testpts <- rasterToPoints(test, spatial = TRUE)
testdf <- data.frame(testpts)
rm(testpts, test)  # free the intermediate objects
str(testdf)
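For scale, a back-of-envelope estimate (my own arithmetic, not from the original post) of the points data frame alone, with three double-precision columns (x, y, layer) over every cell:

```shell
cells=$((8280 * 5760))       # number of raster cells
bytes=$((cells * 3 * 8))     # x, y, layer columns at 8 bytes per double
echo "$cells cells, ~$((bytes / 1024 / 1024)) MiB for the bare data frame"
```

That is over a gigabyte before ggplot2 even starts, and plot construction makes further copies of the data, so peak usage is a multiple of this on an 8 GB machine.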
polygons_brazil <- ne_countries(country = "brazil", scale = "medium", returnclass = "sf")
plot(polygons_brazil)
polygons_southamerica <- ne_countries(country = c("argentina", "bolivia", "chile", "colombia", "ecuador", "guyana", "paraguay", "peru", "suriname", "uruguay", "venezuela"), scale = "medium", returnclass = "sf")
plot(polygons_southamerica)
polygons_ocean <- ne_download(type = "ocean", category = "physical", returnclass = "sf")
plot(polygons_ocean)
# R crashes after this point (ggplot2 is processed by some time)
map <- ggplot() +
  geom_raster(data = testdf, aes(x = x, y = y, fill = layer), show.legend = TRUE) +
  geom_sf(data = polygons_ocean, color = "transparent", lwd = 0.35, fill = "white", show.legend = FALSE) +
  geom_sf(data = polygons_brazil, color = "darkgray", lwd = 0.35, fill = "transparent", show.legend = FALSE) +
  geom_sf(data = polygons_southamerica, color = "darkgray", lwd = 0.35, fill = "gray88", show.legend = FALSE) +
  scale_fill_viridis(breaks = c(1, 11923200, 23846400, 35769600, 47692800), limits = c(1, 47692800)) +
  guides(fill = guide_colorbar(keyheight = 6, ticks = FALSE, title = bquote(delta^18 * O))) +
  ylab("Latitude") +
  xlab("Longitude") +
  coord_sf(xlim = c(-76, -28), ylim = c(-36, 8), expand = FALSE) +
  theme(axis.text.y = element_text(size = 10, color = "black"),
        axis.text.x = element_text(size = 10, color = "black"),
        axis.title.y = element_text(size = 10, color = "black"),
        axis.title.x = element_text(size = 10, color = "black"),
        legend.title = element_text(size = 10),
        legend.text = element_text(size = 9.5),
        legend.box = "vertical",
        panel.background = element_rect(fill = "white"),
        panel.grid.major = element_line(color = "gray96", size = 0.50),
        panel.grid.minor = element_line(color = "gray96", size = 0.30),
        axis.line = element_line(color = "black", size = 0.5),
        panel.border = element_rect(color = "black", fill = NA, size = 0.5)) +
  annotation_scale(location = "br") +
  annotation_north_arrow(location = "br", which_north = "true",
                         pad_x = unit(0, "cm"), pad_y = unit(0.8, "cm"),
                         style = north_arrow_fancy_orienteering)
map
ggsave("test.png", width = 9, height = 6, units = "in", dpi = 300)
Could anyone help me with this?
With the help of a member of another forum (https://community.rstudio.com/t/out-of-memory-on-r-using-linux-but-not-on-windows/106549), I found the solution. The crash was caused by the limited size of the swap partition, as speculated earlier. I increased my swap from 2 GB to 16 GB, and now R/RStudio completes the whole script. It is quite a demanding task: all of the physical memory is exhausted and nearly 15 GB of swap is consumed.
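For anyone hitting the same wall, the swap situation can be checked and fixed from a terminal. This is a sketch under standard Linux conventions: the `/swapfile` path and 16G size are assumptions, and the creation commands need root, so they are commented out here.

```shell
# Inspect current RAM and swap (the crash above happened with only 2 GB swap).
grep -E '^(MemTotal|SwapTotal)' /proc/meminfo
swapon --show   # lists active swap devices/files, if any

# Sketch: create and enable a 16 GB swap file (run as root):
# fallocate -l 16G /swapfile
# chmod 600 /swapfile
# mkswap /swapfile
# swapon /swapfile
# echo '/swapfile none swap defaults 0 0' >> /etc/fstab   # persist across reboots
```

A swap file like this is equivalent to enlarging a swap partition for this purpose, and is easier to resize later.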