I'm looking to transfer some data from a multi-million row table into another table. The problem I'm facing is that I can't just run a single
INSERT INTO ... SELECT ...
because putting all those rows into a temp table crashes the server. So I'm basically thinking I should write a stored procedure or something similar that takes 100 rows at a time and inserts them into the other table, roughly like the sketch below.
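Something along these lines is what I have in mind (just a sketch: source_table, target_table, the id column, and the column names are placeholders, and it assumes target_table starts empty and source_table has an auto-increment primary key id):

DELIMITER $$
CREATE PROCEDURE copy_in_batches()
BEGIN
  DECLARE last_id BIGINT DEFAULT 0;
  DECLARE copied  INT    DEFAULT 1;

  WHILE copied > 0 DO
    -- keyset pagination: walk the primary key instead of using OFFSET,
    -- so each batch is an index range scan, not a rescan from row 0
    INSERT INTO target_table (id, col1, col2)
      SELECT id, col1, col2
        FROM source_table
       WHERE id > last_id
       ORDER BY id
       LIMIT 100;
    SET copied = ROW_COUNT();
    -- MAX() on the primary key is cheap; valid because target started empty
    SELECT COALESCE(MAX(id), 0) INTO last_id FROM target_table;
  END WHILE;
END$$
DELIMITER ;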
Any ideas of solutions/best practices in this matter?
Thanks,
You could use SELECT ... INTO OUTFILE:
SELECT * INTO OUTFILE 'backup.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM <tablename>
WHERE <conditions>; -- plus any other clauses you need
Then insert :
LOAD DATA INFILE 'backup.csv'
INTO TABLE <tablename>
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field1, field2, field3, ...);
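One caveat worth checking first: SELECT ... INTO OUTFILE writes the file on the database server's filesystem (not the client's), requires the FILE privilege, and refuses to overwrite an existing file, and the secure_file_priv setting may restrict which directory it can write to. LOAD DATA INFILE likewise reads from the server host unless you use the LOCAL variant.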