<?php
file_put_contents("10gb.zip", fopen("http://website.website/10GB.zip", 'r'));
echo "File Downloaded!";
I am using this code to download files from a URL to my server, but when I run it my hosting server's memory usage goes into the red and the download stalls at 3.79 GB.
Is there a limit on downloading big files? I want to download more than 50 GB using 5 parallel processes. Is that possible?
I would go for streaming when dealing with large files rather than copying them in one go.
Following the example from http://php.net/manual/en/function.stream-copy-to-stream.php, you can try:
<?php
// Copy the source to the destination in 8 KB chunks so memory use stays flat.
function pipe_streams($in, $out)
{
    while (!feof($in)) {
        fwrite($out, fread($in, 8192));
    }
}

// The function expects open stream resources, not path strings.
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
pipe_streams($in, $out);
fclose($in);
fclose($out);
?>
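Since the linked manual page documents stream_copy_to_stream(), you can also let PHP do the chunked copy for you instead of writing the loop yourself. A minimal sketch, reusing the same URL and file name as above:

<?php
// stream_copy_to_stream() performs the same chunked copy internally.
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);
?>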
Or use cURL (http://php.net/manual/en/book.curl.php):
<?php
$url  = "http://website.website/10GB.zip";
$path = "10gb.zip";

$fp = fopen($path, 'wb');            // open the target file for binary writing
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the response body straight to the file
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
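If you really want five downloads running at once, cURL's multi interface can drive them from a single script. A minimal sketch under the assumption that the files are available at separate URLs; the part URLs and file names below are placeholders:

<?php
// Placeholder URLs and file names; replace with your real downloads.
$jobs = array(
    "http://website.website/part1.zip" => "part1.zip",
    "http://website.website/part2.zip" => "part2.zip",
);

$mh = curl_multi_init();
$handles = array();

foreach ($jobs as $url => $path) {
    $fp = fopen($path, 'wb');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write body straight to disk
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_multi_add_handle($mh, $ch);
    $handles[] = array($ch, $fp);
}

// Drive all transfers until every download has finished.
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100000); // avoid a busy loop if select fails
    }
} while ($running > 0);

foreach ($handles as list($ch, $fp)) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
    fclose($fp);
}
curl_multi_close($mh);
?>

Each transfer writes directly to its own file handle, so memory stays low no matter how large the downloads are.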
Check https://www.sitepoint.com/performant-reading-big-files-php/ for more streaming options.