Tags: mysql, performance, load-data-infile, insert

What is the best way to achieve speedy inserts of large amounts of data in MySQL?


I have written a program in C to parse large XML files and then create files full of insert statements. Another process will then load those files into a MySQL database. This data will serve as an indexing service so that users can find documents easily.

I have chosen InnoDB for its row-level locking. The C program will generate anywhere from 500 to 5 million insert statements on a given invocation.
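
For context, one common way to speed up generated insert statements is to batch many rows into a single INSERT rather than emitting one statement per row. A minimal sketch of what the generator might emit (the table and column names here are hypothetical, not from the question):

    -- Hypothetical indexing table; actual schema is an assumption.
    -- Batching rows into one INSERT cuts per-statement parse and
    -- network overhead compared to one row per statement.
    INSERT INTO doc_index (doc_id, title, keywords) VALUES
        (1, 'First document',  'mysql,indexing'),
        (2, 'Second document', 'xml,parsing'),
        (3, 'Third document',  'bulk,insert');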

What is the best way to get all this data into the database as quickly as possible? The other thing to note is that the DB is on a separate server. Is it worth moving the files over to that server to speed up inserts?

EDIT: This table won't really be updated, but rows will be deleted.


Solution

    • Use the mysqlimport tool or the LOAD DATA INFILE command (see the first sketch below).
    • Temporarily disable any indexes that you don't need for data integrity during the load (second sketch below; note the InnoDB caveat).
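
A minimal sketch of the LOAD DATA INFILE approach, assuming the C program writes a tab-delimited file instead of insert statements (the file path, table, and columns are hypothetical). mysqlimport is a command-line wrapper around the same statement:

    -- Load a tab-delimited file produced by the parser.
    -- LOCAL reads the file from the client machine; if the file is
    -- copied to the database server first, drop LOCAL so the server
    -- reads it directly, which is usually faster.
    LOAD DATA LOCAL INFILE '/tmp/doc_index.tsv'
    INTO TABLE doc_index
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n'
    (doc_id, title, keywords);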
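
On disabling indexes: ALTER TABLE ... DISABLE KEYS only affects MyISAM; InnoDB ignores it. Since the question uses InnoDB, the usual workaround is to drop secondary indexes before the load and rebuild them afterward, along with the standard InnoDB bulk-load session settings. A sketch with hypothetical index and table names:

    -- InnoDB ignores ALTER TABLE ... DISABLE KEYS, so drop and
    -- re-create secondary indexes around the load instead.
    ALTER TABLE doc_index DROP INDEX idx_keywords;

    -- Typical InnoDB bulk-load settings for this session:
    SET unique_checks = 0;
    SET foreign_key_checks = 0;
    SET autocommit = 0;

    -- ... run the LOAD DATA INFILE or batched INSERTs here ...

    COMMIT;
    SET unique_checks = 1;
    SET foreign_key_checks = 1;

    -- Rebuild the secondary index once all rows are in place;
    -- building it once at the end is cheaper than maintaining
    -- it row by row during the load.
    ALTER TABLE doc_index ADD INDEX idx_keywords (keywords);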