There is the RMySQL package.
How can I bulk upload lots of data to MySQL from R? I have a CSV with around 1 million lines and 80 columns.
Would something like this work?
dbWriteTable(con, "test2", "~/data/test2.csv") ## table from a file
I fear this inserts line by line...
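For reference, the con object used above and in the answer below is an RMySQL connection; it would be created roughly like this (user, password, database name, and host are placeholders, not values from the question):

library(RMySQL)
con <- dbConnect(MySQL(), user = "user", password = "password",
                 dbname = "mydb", host = "localhost")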
If you are inserting many rows from the same client at the same time, you can use INSERT statements with multiple VALUES lists:
INSERT INTO test2(col1, col2, col3, col4)
VALUES
('val1', 'val2', val3, val4),
('val1', 'val2', val3, val4),
('val1', 'val2', val3, val4)
Here is an example of how to create such a query. I am using data.table here:
library(data.table)

dat <- matrix(seq(4 * 3), 3, 4)   # toy data: 3 rows, 4 columns, values 1 to 12
DT  <- data.table(t(dat))         # transpose so each column of DT holds one row of dat

## paste() deparses each column as "c(...)"; gsub strips the leading "c" (fine for numeric data);
## the value lists must be separated by commas to form valid SQL
query <- paste('INSERT INTO test2(col1, col2, col3, col4)\nVALUES\n',
               gsub('c', '', DT[, paste(.SD, collapse = ',\n')]))
cat(query)
INSERT INTO test2(col1, col2, col3, col4)
VALUES
 (1, 4, 7, 10),
(2, 5, 8, 11),
(3, 6, 9, 12)
Then you can execute it using dbGetQuery:
dbGetQuery(con, query)
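A minimal sketch of applying the same idea to the CSV from the question, sending the rows in chunks rather than as one giant statement. The chunk size is arbitrary, fread is used only because it reads large CSVs quickly, and the values are pasted in unquoted, so this assumes numeric columns (character values would need quoting/escaping first):

library(data.table)

dat   <- fread("~/data/test2.csv")            # fast CSV reader from data.table
cols  <- paste(names(dat), collapse = ", ")   # column list for the INSERT
chunk <- 10000                                # rows per INSERT statement (arbitrary)

idx <- split(seq_len(nrow(dat)), ceiling(seq_len(nrow(dat)) / chunk))
for (ridx in idx) {
  rows  <- apply(dat[ridx], 1, paste, collapse = ", ")   # "v1, v2, ..." per row
  query <- paste0("INSERT INTO test2(", cols, ")\nVALUES\n(",
                  paste(rows, collapse = "),\n("), ")")
  dbGetQuery(con, query)
}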