I have managed to upload large files to S3 using multipart upload, but I can't download them again using the `getObject` function. Is there another way I can achieve this?
Here is my code:
$keyname = 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';

$result = $s3->getObject([
    'Bucket' => $bucket,
    'Key'    => $keyname
]);

$result['ContentDisposition'] = 'attachment; filename="'.$fileName.'"';

header("Content-Type: {$result['ContentType']}");
header("Content-Disposition: {$result['ContentDisposition']}");
header("Content-Length: {$result['ContentLength']}");

echo $result['Body'];
Thanks for the help. This is my solution:
$keyname = 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';

// create the S3 client
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-central-1',
    'credentials' => [
        // credentials omitted
    ]
]);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $bucket,
    'Key'    => $keyname,
    'ResponseContentDisposition' => 'attachment; filename="'.$fileName.'"'
]);

$request = $s3->createPresignedRequest($cmd, '+15 min');
$presignedUrl = (string) $request->getUri();

echo $presignedUrl;
After this, I download it in my frontend with an `<a>` tag via JavaScript.
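For completeness, the frontend step can be sketched like this — a minimal example that assumes a hypothetical `/download.php` endpoint which runs the PHP above and echoes the presigned URL as plain text (adjust the route to your setup):

```javascript
// Minimal sketch: fetch the presigned URL from the backend, then trigger the
// download with a temporary <a> tag. The '/download.php' endpoint name is an
// assumption; substitute whatever route serves the PHP snippet above.
async function downloadFile(endpoint) {
    const response = await fetch(endpoint);
    const presignedUrl = await response.text();  // the PHP script echoes the URL

    const a = document.createElement('a');
    a.href = presignedUrl;
    document.body.appendChild(a);
    a.click();                                   // browser follows the presigned URL
    a.remove();
}
```

Because the presigned URL carries `ResponseContentDisposition`, S3 itself sends the `Content-Disposition: attachment` header, so no `download` attribute is needed on the `<a>` tag.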
You can create a presigned URL with S3 like this:
$keyname = 'key';
$bucket = 'bucketname';
$fileName = 'filename.txt';

$command = $s3->getCommand('GetObject', array(
    'Bucket' => $bucket,
    'Key'    => $keyname,
    'ResponseContentDisposition' => 'attachment; filename="'.$fileName.'"'
));

// Note: createPresignedUrl() on the command object is the SDK v2 API;
// in SDK v3 use $s3->createPresignedRequest($command, '+15 minutes') instead.
$signedUrl = $command->createPresignedUrl('+15 minutes');

header('Location: '.$signedUrl);