I'm trying to use big.matrix objects in my R code, but I'm having trouble saving them to a file using saveRDS, which is how I normally save objects:
> library(bigmemory)
Loading required package: bigmemory.sri
Loading required package: BH
bigmemory >= 4.0 is a major revision since 3.1.2; please see packages
biganalytics and bigtabulate and http://www.bigmemory.org for more information.
> x <- big.matrix(5, 2, type="integer", init=0,
+ dimnames=list(NULL, c("alpha", "beta")))
> saveRDS(x, "bigmem-test.RDS")
> y <- readRDS("bigmem-test.RDS")
> y
An object of class "big.matrix"
Slot "address":
<pointer: (nil)>
> print(y[])
*** caught segfault ***
address 0x51, cause 'memory not mapped'
Traceback:
1: .Call("GetMatrixAll", x@address)
2: GetAll.bm(x)
3: .local(x, ...)
4: y[]
5: y[]
Possible actions:
1: abort (with core dump, if enabled)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace
Selection: 3
I assume that saveRDS fails to realize that the big.matrix object is actually a pointer to memory managed outside R, and so effectively just saves the pointer. Is there any way I can work around this?
(I don't really want to use a file-backed big.matrix, because the object I actually want to save is a complex data structure containing one or more big.matrix objects. Each contained big.matrix would need its own backing file, so the object would be serialized to an indeterminate number of files instead of just one.)
But bigmemory objects sit behind an external pointer, so their data lives outside the control of R. That means that your idea of saving them as RDS objects from R is doomed from the start: saveRDS serializes the pointer itself, which is meaningless in a new session (the sketch below checks for exactly that).
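To see that it really is just a dangling pointer, inspect the address slot after the round trip. A minimal sketch, assuming bigmemory's is.nil() helper for testing external pointers:

library(bigmemory)

y <- readRDS("bigmem-test.RDS")
# saveRDS kept the address slot, but the memory it pointed at did not
# survive the round trip, so the external pointer comes back nil.
is.nil(y@address)   # TRUE -- indexing y[] at this point segfaults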
You could convert them to normal R objects, eating lots of memory, and then write those as RDS; a sketch of that round trip follows below. Otherwise maybe look into filebacked.big.matrix()? Its backing and descriptor files persist on disk; see the second sketch below.
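For the conversion route, a minimal sketch, assuming the data fits comfortably in R-managed memory; the file name is just for illustration:

library(bigmemory)

x <- big.matrix(5, 2, type = "integer", init = 0,
                dimnames = list(NULL, c("alpha", "beta")))

# x[, ] copies the contents into an ordinary R matrix, which
# saveRDS can serialize properly.
saveRDS(x[, ], "bigmem-test.RDS")

# On the way back, rewrap the plain matrix as a big.matrix.
y <- as.big.matrix(readRDS("bigmem-test.RDS"), type = "integer")
y[, ]

For a complex structure containing several big.matrix objects, you would walk the structure before saveRDS, replacing each big.matrix with its plain-matrix copy, and reverse the substitution after readRDS.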
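And the file-backed route, sketched with hypothetical file names; attach.big.matrix() reopens the matrix from its descriptor file in a later session, with no RDS involved:

library(bigmemory)

fb <- filebacked.big.matrix(5, 2, type = "integer", init = 0,
                            dimnames = list(NULL, c("alpha", "beta")),
                            backingfile = "bigmem-test.bin",
                            descriptorfile = "bigmem-test.desc")
fb[1, "alpha"] <- 42L   # writes go straight to the backing file

# Later, or in a fresh R session: re-attach instead of readRDS.
z <- attach.big.matrix("bigmem-test.desc")
z[, ]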