It looks like we have hit a bug in RCurl: the function getURL() seems to be leaking memory. A simple test case that reproduces the bug:
library(RCurl)
handle <- getCurlHandle()
# Reuse a single curl handle across repeated requests
for (r in 1:100) {
  x <- getURL(url = "news.google.com.au", curl = handle)
}
If I run this code, the memory allocated to the R session is never recovered.
We are using RCurl for some long-running experiments, and we are running out of memory on the test system.
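One way to confirm the growth is to watch the process's resident set size rather than relying on R's own accounting, since gc() will not report C-level allocations made inside libcurl. This is a minimal sketch, assuming a Linux system (as on the Ubuntu box below) where ps accepts the "-o rss=" format; rss_kb is a helper defined here for illustration, not part of RCurl:

library(RCurl)

# Resident set size of this R process in KB (Linux-only; assumes a ps
# that accepts "-o rss=", as Ubuntu's does). Hypothetical helper.
rss_kb <- function() {
  as.integer(system(paste("ps -o rss= -p", Sys.getpid()), intern = TRUE))
}

handle <- getCurlHandle()
before <- rss_kb()
for (r in 1:100) {
  x <- getURL(url = "news.google.com.au", curl = handle)
}
invisible(gc())  # collect R-level garbage before re-measuring
after <- rss_kb()
cat("RSS grew by", after - before, "KB\n")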
The specs of our test system are as follows:
OS: Ubuntu 14.04 (64 bit)
Memory: 24 GB
RCurl version: 1.95-4.3
Any ideas about how to get around this issue?
Thanks
See if getURLContent() also exhibits the problem, i.e. replace getURL() with getURLContent(). The function getURLContent() is a richer version of getURL() and one that gets more attention.
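For concreteness, here is the test loop with that substitution applied, a minimal sketch of the suggested change rather than a confirmed fix. If resident memory stays flat with this version, that narrows the leak to getURL()'s code path:

library(RCurl)
handle <- getCurlHandle()
for (r in 1:100) {
  # Same request as the original test case, routed through getURLContent()
  x <- getURLContent(url = "news.google.com.au", curl = handle)
}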