Tags: r, big-data, aggregate, r-raster, velox

Aggregate a high-resolution (300 m × 300 m) raster (raster::aggregate and velox cannot handle this resolution well)


I'm trying to aggregate a raster r of global extent from ~300 m × 300 m resolution (10 arc-seconds, 7.4 GB) to ~10 km resolution (0.083333 decimal degrees), i.e. by a factor of 30. Neither the aggregate function from the raster package nor the one from the velox package seems to handle such a large dataset. Recommendations are very welcome!

# sample raster
library(raster)
r <- raster(extent(-180, 180, -90, 90))
res(r) <- c(0.5/6/30, 0.5/6/30)
r <- setValues(r, runif(ncell(r))) # Error: cannot allocate vector of size 62.6 Gb

# velox example
devtools::install_github('hunzikp/velox')
library(velox)
vx <- velox(r) # the process aborts on Linux
vx$aggregate(factor = 30, aggtype = 'mean')

# raster example
r_agg <- aggregate(r, fact=30)

Solution

  • You say that raster cannot handle a large raster like that, but that is not true. The problem is that you are trying to create a very large dataset in memory, which requires more memory than your computer has available. You can use the init function instead. I show that below, but with a smaller extent than a global 300 m raster so that the example runs a bit faster.

    library(raster)
    r <- raster(ymn=80, res=0.5/6/30)  # small test extent: latitudes 80 to 90
    r <- init(r, "col")                # fill cells with their column number; no giant vector built in R first
    r_agg <- aggregate(r, fact=30)
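
    For the real 7.4 GB file the same logic applies: raster() only reads the header when you open the file, and aggregate() can work through the cells in chunks and write the result to disk rather than holding everything in memory. A minimal sketch, assuming the 300 m data sit in a GeoTIFF with the hypothetical name "global_300m.tif" (the output name is hypothetical too):

    # hypothetical input file; raster() reads the header, not the values
    r_big <- raster("global_300m.tif")
    # filename= writes the aggregated result straight to disk
    r_big_agg <- aggregate(r_big, fact=30, fun=mean, na.rm=TRUE,
                           filename="global_10km.tif")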
    

    You will get better mileage with the terra package:

    library(terra)
    rr <- rast(ymin=80, res=0.5/6/30)
    rr <- init(rr, "col")
    rr_agg <- aggregate(rr, fact=30)
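
    The full-size file would be handled the same way in terra (again assuming the hypothetical "global_300m.tif"); terra::aggregate also accepts a filename argument, so the aggregated output goes straight to disk. A rough sketch:

    rr_big <- rast("global_300m.tif")   # only the header is read here
    rr_big_agg <- aggregate(rr_big, fact=30, fun="mean", na.rm=TRUE,
                            filename="global_10km.tif")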