I have many file chunks that I need to merge using PHP's fopen function. However, I'm worried about memory usage.
For example, I have about 100 files listed in split_hash.txt, each about 100 MB. Here is how I combine them:
<?php
$hash = file_get_contents("split_hash.txt");
$list = explode("\r\n", $hash);
$fp = fopen("hadoop2.zip", "ab");
foreach ($list as $value) {
    if (!empty($value)) {
        $handle = fopen($value, "rb");
        fwrite($fp, fread($handle, filesize($value)));
        fclose($handle);
        unset($handle);
    }
}
fclose($fp);
echo "ok";
Will it cost a lot of my memory?
Yes, it will, if you use fread($handle, filesize($value)) to read each entire file in one call. Instead, read each file in smaller chunks.
I would change:
fwrite($fp,fread($handle,filesize($value)));
to:
while (!feof($handle)) {
    fwrite($fp, fread($handle, 1048576));
}
so that you are only dealing with one megabyte (1048576 bytes) at a time.
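Alternatively, PHP's built-in stream_copy_to_stream() copies one stream into another using an internal buffer, so a whole chunk file is never held in memory at once. Here is a minimal sketch of the same merge; the merge_chunks helper and its parameters are my own naming rather than anything from the question, and I open the output in "wb" mode instead of "ab" so reruns start from an empty file.

```php
<?php
// Hypothetical helper wrapping the merge from the question.
function merge_chunks(string $listFile, string $outFile): void
{
    // FILE_IGNORE_NEW_LINES strips the trailing newline from each entry;
    // FILE_SKIP_EMPTY_LINES drops blank lines.
    $list = file($listFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

    $fp = fopen($outFile, "wb");
    foreach ($list as $value) {
        $value = trim($value);   // also drop any stray \r from \r\n endings
        if ($value === "") {
            continue;
        }
        $handle = fopen($value, "rb");
        // Copies via PHP's internal buffer, so memory use stays flat
        // no matter how large each chunk file is.
        stream_copy_to_stream($handle, $fp);
        fclose($handle);
    }
    fclose($fp);
}
```

The memory profile is the same as the chunked fread loop above; stream_copy_to_stream just saves you from writing the loop yourself.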