Hi, I have a large CSV file with 130,000 rows.
I am using Laravel Excel 3.1 and Laravel 5.8.
Import class:
<?php

namespace App\Imports;

use App\UsoSuelo;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithBatchInserts;

class UsoSueloImport implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts
{
    /**
     * @param array $row
     *
     * @return \Illuminate\Database\Eloquent\Model|null
     */
    public function model(array $row)
    {
        return new UsoSuelo([
            'cod_pais'  => $row['cod_pais'],
            'cod_fundo' => $row['cod_fundo'],
            'nom_fundo' => $row['nom_fundo'],
        ]);
    }

    public function batchSize(): int
    {
        return 1000;
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
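From the Laravel Excel 3.1 docs, my understanding is that adding ShouldQueue to an import that already uses WithChunkReading makes every chunk run as a queued job instead of inside the HTTP request. Roughly what that would look like for my class (I have not tried it yet, and it needs a real queue driver, not sync):

use Illuminate\Contracts\Queue\ShouldQueue;

// Same class as above; only the "implements" list changes. Each 1000-row chunk is
// then processed by a queue worker instead of inside the web request.
class UsoSueloImport implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts, ShouldQueue
{
    // model(), batchSize() and chunkSize() stay exactly as they are above
}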
And I call the imports through a trait used by my controller:
trait storeTrait
{
    public function storeUsoSuelo($archivo)
    {
        Excel::import(new UsoSueloImport, $archivo);
    }

    public function storeFundo($archivo)
    {
        Excel::import(new FundosImport, $archivo);
    }

    public function storeFundoGrilla($archivo)
    {
        Excel::import(new FundosGrillasImport, $archivo);
    }

    public function storeCuadrante($archivo)
    {
        Excel::import(new CuadrantesImport, $archivo);
    }
}
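(The trait relies on use Maatwebsite\Excel\Facades\Excel; at the top of the file.) If the import classes implemented ShouldQueue as sketched above, my understanding from the docs is that the same call could also be queued explicitly, roughly like this:

use Maatwebsite\Excel\Facades\Excel;

// Queued variant of one of the trait methods (sketch only). This only makes sense if
// UsoSueloImport implements ShouldQueue and the queue connection is not 'sync'.
public function storeUsoSuelo($archivo)
{
    Excel::queueImport(new UsoSueloImport, $archivo);
}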
This is my ImportController:
class ImportController extends Controller
{
    use storeTrait {
        storeUsoSuelo as storeUsoSuelos;
        storeFundo as storeFundos;
        storeFundoGrilla as storeFundoGrillas;
        storeCuadrante as storeCuadrantes;
    }

    public function store(Request $request)
    {
        $usoSuelo = 'uso_suelo.csv';
        $this->storeUsoSuelos($usoSuelo);

        $cuadrante = 'cuadrantes.csv';
        $this->storeCuadrantes($cuadrante);

        $fundo = 'mv_qav_fundos.csv';
        $this->storeFundos($fundo);

        $fundoGrilla = 'fundos_grilla.csv';
        $this->storeFundoGrillas($fundoGrilla);
    }
}
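To narrow down which of the four files actually eats the time, one thing I could do is time each import; a quick sketch that skips the trait just for measuring (\Log is Laravel's standard log facade):

public function store(Request $request)
{
    $archivos = [
        'uso_suelo.csv'     => new UsoSueloImport,
        'cuadrantes.csv'    => new CuadrantesImport,
        'mv_qav_fundos.csv' => new FundosImport,
        'fundos_grilla.csv' => new FundosGrillasImport,
    ];

    foreach ($archivos as $archivo => $import) {
        $inicio = microtime(true);
        Excel::import($import, $archivo);
        // Log the duration per file so the slow one stands out in laravel.log.
        \Log::info($archivo . ' imported in ' . round(microtime(true) - $inicio, 2) . 's');
    }
}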
I have tested this: the code works with a CSV of fewer than 100 rows, but with the 130,000-row file it takes far too long and I end up with the following error:
"Maximum execution time of 60 seconds exceeded"
After 1 minute, only 4,000 rows have been inserted into the database (PostgreSQL).
I put these two lines at the beginning of the script in my controller:
ini_set('max_execution_time', 3600);
ini_set('memory_limit', '2048M');
That got rid of the error, and I also raised the chunk size from 1000 to 5000, but the import still takes too long: at least 5 minutes.
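Since each CSV maps one-to-one onto a table, I am also wondering whether I should skip Eloquent for the big file and let PostgreSQL bulk-load it with COPY through the pgsql PDO driver. A rough sketch of what I mean (the table name uso_suelos is my assumption, and the heading row would have to be stripped first, since this call has no header option):

use Illuminate\Support\Facades\DB;

// Sketch: bulk-load one CSV with PostgreSQL's COPY instead of row-by-row inserts.
// Assumes the target table is 'uso_suelos', the file is comma-separated, readable by
// the PHP process, and the heading row has been removed beforehand.
DB::connection()->getPdo()->pgsqlCopyFromFile(
    'uso_suelos',                       // target table (assumed name)
    storage_path('app/uso_suelo.csv'),  // absolute path to the CSV
    ',',                                // field delimiter
    "\\\\N",                            // NULL marker (default from the PHP docs)
    'cod_pais,cod_fundo,nom_fundo'      // columns in the same order as the CSV
);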