Tags: php, amazon-s3, curl-multi

AWS S3 batch upload from localhost PHP error


I am trying to batch/bulk upload from localhost (XAMPP) to my S3 bucket.
It seems to work for about 6 items, then I get an error message:

According to http://curl.haxx.se/libcurl/c/libcurl-errors.html, the cURL error (code 55) means "Failed sending network data."

Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.' in D:\xampp\htdocs\path\to\my\files\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

and

cURL_Multi_Exception: cURL resource: Resource id #34; cURL error: SSL_write() returned SYSCALL, errno = 0 (cURL error code 55). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes. in D:\xampp\htdocs\path\to\my\files\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

Here's my PHP. It gets a list of images from a directory, and from that loop I want to batch upload those items to S3.

require_once('sdk-1.5.14/sdk.class.php');
$s3 = new AmazonS3();
//$s3->disable_ssl_verification(); // this didn't fix it

$folder = "./../"; 
$handle = opendir($folder); 

# Build an array of the files in the directory to upload to S3
while (false !== ($file = readdir($handle))) 
{ 
        $files[] = $file; 
} 
closedir($handle);

foreach ($files as $file) { 
        $path_parts = pathinfo($file);
        if(isset($path_parts['extension']) && $path_parts['extension'] != '') {

                // local path
                $fileTempName = "D:/xampp/htdocs/path/to/my/files/";

                // queue this upload on the SDK's internal batch
                $response = $s3->batch()->create_object('bucketname', "tempdirectory/" . $file, array(
                        'fileUpload' => fopen($fileTempName . $file, 'r'),
                        'acl' => AmazonS3::ACL_PUBLIC
                ));

        }

}
$s3->batch()->send();

Update: after making changes to config.inc.php I am now getting these error messages:

Fatal error: Uncaught exception 'cURL_Multi_Exception' with message 'cURL resource: Resource id #149; cURL error: Failed connect to mybucket.s3.amazonaws.com:443; No error (cURL error code 7). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes.' in D:\xampp\htdocs\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902

cURL_Multi_Exception: cURL resource: Resource id #149; cURL error: Failed connect to prayerbucket.s3.amazonaws.com:443; No error (cURL error code 7). See http://curl.haxx.se/libcurl/c/libcurl-errors.html for an explanation of error codes. in D:\xampp\htdocs\sdk-1.5.14\lib\requestcore\requestcore.class.php on line 902
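
For context, the config.inc.php that ships with SDK 1.5.x is normally just the credential registration sketched below; the key/secret values and the "development" credential-set name are placeholders, not taken from the question:

<?php
// config.inc.php for SDK 1.5.x -- the values below are placeholders
CFCredentials::set(array(
    'development' => array(
        'key' => 'YOUR_AWS_ACCESS_KEY',
        'secret' => 'YOUR_AWS_SECRET_KEY',
        'default_cache_config' => '',     // no response caching
        'certificate_authority' => false  // use the default CA bundle
    ),
    '@default' => 'development'
));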


Solution

  • Try setting a limit on the batch:

    $batch = new CFBatchRequest(2); // only two requests in flight at once
    
    foreach ($files as $file) { 
            $path_parts = pathinfo($file);
            if(isset($path_parts['extension']) && $path_parts['extension'] != '') {
    
                    // local path
                    $fileTempName = "D:/xampp/htdocs/path/to/my/files/";
    
                    // in batch mode, create_object() returns a cURL handle queued on $batch
                    $curl_handler = $s3->batch($batch)->create_object('bucketname', "tempdirectory/" . $file, array(
                            'fileUpload' => fopen($fileTempName . $file, 'r'),
                            'acl' => AmazonS3::ACL_PUBLIC
                    ));
    
            }
    
    }
    // send the queued requests for this batch
    $batch->send();
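
  • If limiting concurrency alone is not enough, a variation on the same idea is to queue and send the uploads in smaller chunks, so only a handful of transfers are ever open at once. This is only a sketch under the same SDK 1.5.x assumptions, and the chunk size of 10 is arbitrary:

    foreach (array_chunk($files, 10) as $chunk) {
            // fresh batch per chunk, still limited to two parallel requests
            $batch = new CFBatchRequest(2);
    
            foreach ($chunk as $file) {
                    $path_parts = pathinfo($file);
                    if (isset($path_parts['extension']) && $path_parts['extension'] != '') {
                            $fileTempName = "D:/xampp/htdocs/path/to/my/files/";
                            $s3->batch($batch)->create_object('bucketname', "tempdirectory/" . $file, array(
                                    'fileUpload' => fopen($fileTempName . $file, 'r'),
                                    'acl' => AmazonS3::ACL_PUBLIC
                            ));
                    }
            }
    
            // send this chunk before queuing the next one
            $batch->send();
    }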