Tags: php, laravel, file, stream

How to dynamically set max_execution_time based on file size in PHP


I have a job in Laravel that downloads .zip files from an FTP server to the local disk using the Laravel filesystem. The sizes of those files range from a few MB up to several GB.

This is my current code, which works fine as long as I increase max_execution_time:

public function handle()
{
    ini_set('max_execution_time', 300);
    $destination = storage_path('app/temp/' . $this->import->unique_filename);

    $ftpDisk = Storage::disk($this->import->disk);

    $stream = $ftpDisk
        ->getDriver()
        ->readStream($this->import->filename);

    // stream_get_contents() buffers the entire remote file in memory
    // before anything is written to disk.
    file_put_contents($destination, stream_get_contents($stream), FILE_APPEND);
}

Is there a way to download the file in smaller chunks? Or is there a better option than raising max_execution_time to its maximum?

Another approach would be to extract the archive on the FTP server and read its contents (CSV/JSON and images) file by file.


Solution

  • It's not recommended to change the maximum execution time while the job is running. Instead, dispatch the job onto a queue, which is where time-consuming tasks such as downloads belong (a sketch follows below).

    Then run the queue worker with --timeout=0 so the worker never kills a long-running job:

    php artisan queue:listen --timeout=0
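
    Here is a minimal sketch of that setup, assuming a queueable job class named DownloadImportFile (a hypothetical name) with the same $import object as in the question. It also replaces stream_get_contents() with a chunked copy, so only one chunk of the archive is held in memory at a time:

    <?php

    namespace App\Jobs;

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Support\Facades\Storage;

    class DownloadImportFile implements ShouldQueue
    {
        use InteractsWithQueue, Queueable, SerializesModels;

        protected $import;

        public function __construct($import)
        {
            $this->import = $import;
        }

        public function handle()
        {
            $destination = storage_path('app/temp/' . $this->import->unique_filename);

            // readStream() is available directly on the disk, so the
            // remote file can be consumed without loading it whole.
            $source = Storage::disk($this->import->disk)
                ->readStream($this->import->filename);

            $target = fopen($destination, 'wb');

            // Copy in 1 MB chunks; memory use stays constant no matter
            // how large the archive is.
            while (!feof($source)) {
                fwrite($target, fread($source, 1024 * 1024));
            }

            fclose($source);
            fclose($target);
        }
    }

    Dispatch it with dispatch(new DownloadImportFile($import)); and the worker started with --timeout=0 above will pick it up. Since queued jobs run in the CLI, where max_execution_time defaults to 0 anyway, no ini_set() call is needed at all.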