I have a file that's so large I'm unable to read it into a string in one go, but have to use buffering:
$fp = @fopen("bigfile", 'rb');
while (!feof($fp)) {
    // process buffer
}
For simplicity, say the file contains a sequence of integer-string pairs, where the integer holds the length of the string that follows. The code I want to realise in // process buffer is: unpack an integer, read that many characters from the buffer, then repeat.
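Roughly, I picture something like this for a single buffer (just a sketch; the 4-byte big-endian length prefix and the BUFFER_SIZE constant are placeholders for illustration, not my actual format):
define('BUFFER_SIZE', 8192);          // placeholder chunk size
$buffer = fread($fp, BUFFER_SIZE);
$offset = 0;
while ($offset + 4 <= strlen($buffer)) {
    // unpack a 4-byte big-endian length prefix
    $len = unpack('N', substr($buffer, $offset, 4))[1];
    $offset += 4;
    // read that many characters... but this breaks as soon as the
    // string (or the prefix itself) runs past the end of $buffer
    $string = substr($buffer, $offset, $len);
    $offset += $len;
    // ... process $string
}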
I'd appreciate any suggestions for dealing with the scenario where a string spans from one buffer to the next. I'm sure this problem must have been solved before and that there is a design pattern for it; I just don't know where to start looking.
Any help would be appreciated.
Not sure if you're looking for an extra-clever solution, but a straightforward approach would be:
define('MAX_CHUNK_LEN', 8192); // whatever chunk size suits your memory budget

while (!feof($fp)) {
    $len = fread($fp, 2); // integer - 2 bytes ...?
    // <--- add checks here: strlen($len) == 2 and so on...
    $len = unpack('S', $len)[1]; // pick the correct format character from http://docs.php.net/function.pack
    while (!feof($fp) && $len) {
        $cbRead = $len < MAX_CHUNK_LEN ? $len : MAX_CHUNK_LEN;
        $buf = fread($fp, $cbRead);
        // <--- add checks here: strlen($buf) == $cbRead and so on...
        $len -= $cbRead;
        // ... process $buf
    }
    if ($len != 0) {
        errorHandler();
    } else {
        processEndOfString();
    }
}
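The inner while loop is what covers your buffer-spanning concern: a string longer than MAX_CHUNK_LEN is simply read and processed in MAX_CHUNK_LEN-sized pieces, so you never hold more than one chunk of it in memory at a time. If $len is still non-zero once feof() hits, the file ended in the middle of a string, which is the case the errorHandler() branch catches.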