What is the best approach to combine multiple MySQL tables in R? For instance, I need to rbind 14 large MySQL tables (each >100k rows by 100 columns). I tried the approach below, which consumed most of my memory and timed out on the MySQL side. Is there an alternative solution? I do not need to fetch the whole tables; I only need to group each table by a couple of variables and calculate some metrics.
station_tbl_t <- dbSendQuery(my_db, "select * from tbl_r3_300ft
union all
select * from tbl_r4_350ft
union all
select * from tbl_r5_400ft
union all
select * from tbl_r6_500ft
union all
select * from tbl_r7_600ft
union all
select * from tbl_r8_700ft
union all
select * from tbl_r9_800ft
union all
select * from tbl_r10_900ft
union all
select * from tbl_r11_1000ft
union all
select * from tbl_r12_1200ft
union all
select * from tbl_r13_1400ft
union all
select * from tbl_r14_1600ft
union all
select * from tbl_r15_1800ft
union all
select * from tbl_r16_2000ft
")
Consider iteratively importing the MySQL table data and then row binding in R. Be sure to select only the needed columns to reduce transfer and memory overhead:
library(DBI)    # assumes my_db is an open DBI connection (e.g., from RMySQL/RMariaDB)

tbls <- c("tbl_r3_300ft", "tbl_r4_350ft", "tbl_r5_400ft",
          "tbl_r6_500ft", "tbl_r7_600ft", "tbl_r8_700ft",
          "tbl_r9_800ft", "tbl_r10_900ft", "tbl_r11_1000ft",
          "tbl_r12_1200ft", "tbl_r13_1400ft", "tbl_r14_1600ft",
          "tbl_r15_1800ft", "tbl_r16_2000ft")

sql <- "SELECT Col1, Col2, Col3 FROM"

# import each table, capturing the error message instead of stopping the loop
dfList <- lapply(paste(sql, tbls), function(s) {
  tryCatch(dbGetQuery(my_db, s),
           error = function(e) as.character(e))
})
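
# tryCatch returns the error message as a character string when a query
# fails, so drop any such entries before row binding
dfList <- Filter(is.data.frame, dfList)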
# ROW BIND VERSIONS ACROSS PACKAGES (pick one)
master_df <- base::do.call(rbind, dfList)     # base R; requires matching column names
master_df <- plyr::rbind.fill(dfList)         # fills non-matching columns with NA
master_df <- dplyr::bind_rows(dfList)         # fills non-matching columns with NA
master_df <- data.table::rbindlist(dfList)    # returns a data.table; use fill = TRUE to NA-fill
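
Alternatively, since you only need grouped metrics rather than the raw rows, push the aggregation into MySQL so each query returns a handful of summary rows instead of >100k. A minimal sketch, assuming Col1 and Col2 are your grouping variables and using COUNT/AVG over a numeric Col3 as placeholders for your actual metrics:

# build one aggregating query per table (paste is vectorized over tbls)
sql_agg <- paste("SELECT Col1, Col2, COUNT(*) AS n, AVG(Col3) AS avg_col3 FROM",
                 tbls, "GROUP BY Col1, Col2")

aggList <- lapply(sql_agg, function(s) {
  tryCatch(dbGetQuery(my_db, s),
           error = function(e) as.character(e))
})

# combine per-table summaries, dropping any error strings
master_agg <- dplyr::bind_rows(Filter(is.data.frame, aggList))

Note this yields one summary row per group per table; if the same group values appear in multiple tables and you need a single row per group, re-aggregate master_agg in R, weighting any averages by n.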