
How to write to a different file when the file size becomes bigger than 50 GB


My current code:

$fileone = fopen("fileone.sql", "r");      //opens file fileone.sql
$fileonewrite = fopen("fileone.rdf", "w"); //this is the file to write to

$fileNum = 1;
$i = 0;
while (!feof($fileone)) { //feof = while not end of file

    if ($contents = fread($fileonewrite, 53687091200));  //if file is more than 50gb, write to new file (below) .. this doesnt seem to work properly
    {
        file_put_contents('fileone' . $fileNum . '.rdf', $contents);
        $fileNum++;
    }

    $fileoneRow[] = fgets($fileone);                //fgets gets line
    $fileoneParts = explode("\t", $fileoneRow[$i]); //explode using tab delimiter

    fwrite( " lots of stuff" );
    unset($fileoneParts);
}
fclose($fileonetype);
fclose($fileonewrite);

I'm reading lots of data and outputting even more; the file created easily gets up to >200 GB, and this causes a memory problem. So what I would like to do is: when the file being written, e.g. fileone.rdf, gets to 50 GB, start writing to filetwo. My code at the moment doesn't work very well, as it seems to output thousands of empty files.

Thanks for reading my query; any help, as always, is much appreciated.


Solution

  • if ($contents = fread($fileonewrite ,53687091200));  //if file is more than 50gb, write  to new file (below) .. this doesnt seem to work properly
                                                      ^----BUG BUG BUG
    {    file_put_contents('fileone'.$fileNum.'.rdf',$contents);
    $fileNum++; 
    }
    

    That ; terminates the if() statement right there, so the code inside the {} is NOT part of the if(). Even when fread() returns no data (at EOF, for example), you'd still write an empty $contents out to a file, regardless of how the if() test came out.
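
    To see the effect in isolation, here is a minimal stand-alone illustration of that empty-statement pitfall (the variable name is made up for the example):

    $x = false;

    if ($x);                        // the ; ends the if() with an empty body
    {
        echo "this ALWAYS runs\n";  // the braces are just a plain block, not the if() body
    }

    if ($x) {
        echo "this never runs\n";   // without the stray ;, the test actually applies
    }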

    Reading 50 GB of data in one go is simply insane. Why not something more along these lines:

    $in  = fopen('input', 'rb');        // source file
    $out = fopen('output', 'wb');       // first output file
    $limit = 50 * 1024 * 1024 * 1024;   // 50 GB
    $read = 0;
    while ($data = fread($in, 1024 * 1024)) { // read in 1 MB chunks
        fwrite($out, $data);
        $read += strlen($data);
        if ($read > $limit) {
            fclose($out);
            $out = fopen('new file goes here', 'wb'); // open the next output file
            $read = 0;
        }
    }
    fclose($in);
    fclose($out);
    

    That'll do the copying in 1 MB chunks, which places far, far less pressure on system memory, and then swaps to a new file whenever you've finally copied 50 GB.
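
    Since your loop processes the input line by line (fgets + explode) rather than copying it verbatim, the same rollover idea can be adapted roughly like this. This is only a sketch: the output string is a placeholder for whatever you actually build from the tab-separated parts, and the file names follow the naming from your question.

    $in      = fopen('fileone.sql', 'r');
    $fileNum = 1;
    $out     = fopen('fileone' . $fileNum . '.rdf', 'w');
    $limit   = 50 * 1024 * 1024 * 1024;        // 50 GB per output file
    $written = 0;

    while (($line = fgets($in)) !== false) {
        $parts  = explode("\t", $line);        // tab-delimited input, as in your code
        $output = "lots of stuff\n";           // placeholder: build your RDF output from $parts

        $written += fwrite($out, $output);
        if ($written > $limit) {               // current output file is full, roll over
            fclose($out);
            $fileNum++;
            $out = fopen('fileone' . $fileNum . '.rdf', 'w');
            $written = 0;
        }
    }

    fclose($in);
    fclose($out);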