Laravel backup error uploading large backup to S3


I have a Laravel project that creates a new backup daily using spatie/laravel-backup and uploads it to S3. It is properly configured and has been working for over a year without a problem.

Suddenly, the backup can't complete the upload process because of the following error:

Copying zip failed because: An exception occurred while uploading parts to a multipart upload. The following parts had errors:
- Part 17: Error executing "UploadPart" on "https://s3.eu-west-1.amazonaws.com/my.bucket/Backups/2019-04-01-09-47-33.zip?partNumber=17&uploadId=uploadId"; AWS HTTP error: cURL error 55: SSL_write() returned SYSCALL, errno = 104 (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)  (server): 100 Continue -
- Part 16: Error executing "UploadPart" on "https://s3.eu-west-1.amazonaws.com/my.bucket/Backups/2019-04-01-09-47-33.zip?partNumber=16&uploadId=uploadId"; AWS HTTP error: Client error: `PUT https://s3.eu-west-1.amazonaws.com/my.bucket/Backups/2019-04-01-09-47-33.zip?partNumber=16&uploadId=uploadId` resulted in a `400 Bad Request` response:
<?xml version="1.0" encoding="UTF-8"?>
<Code>RequestTimeout</Code><Message>Your socket connection to the server w (truncated...)
 RequestTimeout (client): Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed. - <?xml version="1.0" encoding="UTF-8"?>
<Code>RequestTimeout</Code>
<Message>Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.</Message>
<RequestId>RequestId..</RequestId>
<HostId>Host id..</HostId>

I tried running:

php artisan backup:run --only-db // 110MB zip file
php artisan backup:run --only-files // 34MB zip file

And they both work properly. My guess is that the error is caused by the size of the full zip (around 145MB), which would explain why it never occurred before, when the backup was smaller. The laravel-backup package has a related issue, but I don't think the problem is with the library itself, which just uses the underlying S3 Flysystem adapter to upload the zip.

Is there some parameter I should set in php.ini (e.g. to increase the cURL upload size limit), or a way to split the upload into multiple chunks?


Solution

  • You can try adding the timeout parameter to the S3Client configuration (https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_configuration.html).

    Like this:

    $s3 = new Aws\S3\S3Client([
        'version'     => 'latest',
        'region'      => 'us-west-2',
        'credentials' => $credentials,
        'http'        => [
            'timeout' => 360 // request timeout in seconds
        ]
    ]);
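
    The 'http' array is forwarded to the HTTP handler (Guzzle by default), so 'timeout' is the maximum number of seconds a single request may take; each UploadPart call of the multipart upload is its own request, so 360 gives every part up to six minutes to complete.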
    

    But in Laravel, you should do it in config/filesystems.php like this:

    'disks' => [
        's3' => [
            'driver' => 's3',
            'key'    => env('AWS_ACCESS_KEY_ID'),
            'secret' => env('AWS_SECRET_ACCESS_KEY'),
            'region' => 'us-east-1', // use your bucket's region (eu-west-1 in the question)
            'bucket' => env('FILESYSTEM_S3_BUCKET'),
            'http'   => [
                'timeout' => 360
            ]
        ]
    ]
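
  • If raising the timeout is still not enough, the file can also be uploaded in explicit chunks with the AWS SDK's MultipartUploader, which addresses the "multiple chunks" part of the question. The sketch below is a manual alternative to letting Flysystem stream the zip, assuming the bucket and key from the question; the local path, part_size, and concurrency values are only illustrative:

    use Aws\Exception\MultipartUploadException;
    use Aws\S3\MultipartUploader;
    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version'     => 'latest',
        'region'      => 'eu-west-1',
        'credentials' => $credentials,
        'http'        => ['timeout' => 360]
    ]);

    // Upload the zip in 25MB parts, one part at a time, so each request stays small.
    $uploader = new MultipartUploader($s3, '/path/to/backup.zip', [ // hypothetical local path
        'bucket'      => 'my.bucket',
        'key'         => 'Backups/2019-04-01-09-47-33.zip',
        'part_size'   => 25 * 1024 * 1024,
        'concurrency' => 1
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete: {$result['ObjectURL']}\n";
    } catch (MultipartUploadException $e) {
        // $e->getState() can be passed to a new MultipartUploader to resume the upload.
        echo $e->getMessage() . "\n";
    }

    Keep in mind that the SDK already performs a multipart upload under the hood (that is where the failing UploadPart requests come from); doing it manually only gives you control over the part size, concurrency, and retry behaviour.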