Tags: laravel, laravel-4

Best case for seeding large data in Laravel


I have one file with over 30,000 records and another with 41,000. Is there a recommended approach for seeding these with Laravel 4's db:seed command — some way to make the inserts faster?

Thanks for the help.


Solution

  • Don't be afraid: a 40K-row table is a fairly small one. I have a 1-million-row table and seeding went smoothly; I just had to add this before running it:

    DB::disableQueryLog();
    

    Before I disabled it, Laravel exhausted my PHP memory limit no matter how high I set it.
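
    In Laravel 4 that call goes at the top of the seeder's run() method, before any inserts. A minimal sketch (the class name and comment placement are illustrative, not from the answer):

    ```php
    <?php

    class LargeTableSeeder extends Seeder {

        public function run()
        {
            // Stop Laravel from keeping every executed query in memory.
            // Without this, the query log grows with each insert until
            // PHP hits its memory limit.
            DB::disableQueryLog();

            // ... perform the inserts here ...
        }

    }
    ```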

    I read the data from .txt files using fgets(), building the array programmatically and executing:

    DB::table($table)->insert($row);
    

    One row at a time, which may be particularly slow.
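
    A common way to speed this up (a sketch of an alternative, not what the answer actually did) is to buffer the parsed lines and pass an array of rows to insert(), so each query inserts many records at once. The file format, column names, table name, and chunk size below are all assumptions for illustration; array() syntax is used since Laravel 4 supports PHP 5.3, which lacks the short [] form:

    ```php
    <?php

    $handle = fopen('data.txt', 'r');
    $rows = array();

    while (($line = fgets($handle)) !== false) {
        // Assumed format: two tab-separated columns per line.
        list($name, $email) = explode("\t", trim($line));
        $rows[] = array('name' => $name, 'email' => $email);

        // Flush every 500 rows: one multi-row INSERT per chunk
        // instead of 500 single-row queries.
        if (count($rows) === 500) {
            DB::table('users')->insert($rows);
            $rows = array();
        }
    }

    // Insert any remaining rows.
    if (count($rows) > 0) {
        DB::table('users')->insert($rows);
    }

    fclose($handle);
    ```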

    My database server is PostgreSQL and the inserts took around 1.5 hours to complete, perhaps because I was running in a low-memory VM. I'll benchmark it on a better machine one of these days.
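
    Another lever worth trying on PostgreSQL (an assumption on my part, not something the answer benchmarked) is wrapping the whole seed in one transaction, since committing after every single-row insert is often the dominant cost. Laravel 4's DB::transaction() handles the begin/commit/rollback for you; the table name and $rows variable here are illustrative:

    ```php
    <?php

    // One commit at the end instead of one per row.
    DB::transaction(function () use ($rows) {
        foreach ($rows as $row) {
            DB::table('my_table')->insert($row);
        }
    });
    ```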