I have a really large table (8M rows) that I need to import into R for some processing. The problem is that when I try to bring it into R using the DBI package, I get an error.
My code is below:
options(java.parameters = "-Xmx8048m")  # raise the JVM heap before rJava is loaded
library(RJDBC)

psql.jdbc.driver <- "../postgresql-42.2.1.jar"
jdbc.url <- "jdbc:postgresql://server_url:port"
pgsql <- JDBC("org.postgresql.Driver", psql.jdbc.driver, "\"")  # PostgreSQL quotes identifiers with double quotes
con <- dbConnect(pgsql, jdbc.url, user = "", password = "")
tbl <- dbGetQuery(con, "SELECT * FROM my_table;")  # tries to pull the whole table at once
And the error I get is:
Error in .verify.JDBC.result(r, "Unable to retrieve JDBC result set for ", :
Unable to retrieve JDBC result set for SELECT * FROM my_table; (Ran out of memory retrieving query results.)
I understand it's because the result set is too big, but I am not sure how to retrieve it in batches instead of all at once. I have tried dbSendQuery, dbReadTable, and dbGetQuery; all of them give the same error.
Any help would be appreciated!
I got it to work by using the RPostgreSQL package instead of RJDBC. With it I could run dbSendQuery and then call dbFetch in a loop to pull the data in chunks of 10,000 rows.
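For context, the loop below assumes a connection and an open result set created roughly like this (host, port, dbname, and credentials are placeholders, not my actual values):

library(RPostgreSQL)

# Placeholder connection details -- substitute your own server and credentials
con <- dbConnect(PostgreSQL(), host = "server_url", port = 5432,
                 dbname = "my_db", user = "", password = "")

# Send the query without fetching anything yet; rows are pulled later in chunks
postgres_query <- dbSendQuery(con, "SELECT * FROM my_table;")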
# main_tbl <- dbFetch(postgres_query, n = -1)  # didn't work, so fetch in chunks instead

df <- data.frame()
while (!dbHasCompleted(postgres_query)) {
  chunk <- dbFetch(postgres_query, 10000)  # pull the next 10,000 rows
  print(nrow(chunk))                       # progress indicator
  df <- rbind(df, chunk)                   # append the chunk to the accumulated result
}
dbClearResult(postgres_query)              # release the result set when finished
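One note on the design: rbind inside the loop copies the growing data frame on every iteration, which gets slow with 8M rows. An untested variant of the same loop that should scale better collects the chunks in a list and binds them once at the end:

chunks <- list()
i <- 1
while (!dbHasCompleted(postgres_query)) {
  chunks[[i]] <- dbFetch(postgres_query, 10000)  # store each chunk instead of rbind-ing immediately
  i <- i + 1
}
df <- do.call(rbind, chunks)   # single bind avoids repeated copying
dbClearResult(postgres_query)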