Tags: php, file-handling, large-data

PHP - read a huge number of files from a directory


I have a folder with a huge number of pictures (at least 10,000 files) and I need to get the names of all these files using PHP. The problem is that when I use scandir() I get an error about exceeding the memory limit. I also tried code like this:

    $files = [];
    $dir = opendir($this->path);
    $i = 0;
    while (($file = readdir($dir)) !== false) {
        $files[] = $file;
        $i++;
        if ($i == 100) {
            break;
        }
    }

This code works fine, but it's not what I need: when I try to read all the files, the script still crashes.

I also thought about somehow saving the state of the directory pointer in $dir so I could continue reading from it in later AJAX requests and eventually get all the files, but I couldn't find any way to do that.

Is there any way to set a limit and an offset for reading files, like pagination?


Solution

  • You can use RecursiveDirectoryIterator with a Generator if memory is a huge issue.

    // Generator: yields one filename at a time instead of building the whole
    // list in memory, so peak memory stays flat however many files there are.
    function recursiveDirectoryIterator($path) {
        $it = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($it as $file) {
            if (!$file->isDir()) {
                // getFilename() already includes the extension
                yield $file->getFilename();
            }
        }
    }
    
    $start = microtime(true);
    $instance = recursiveDirectoryIterator('../vendor');
    $total_files = 0;
    foreach ($instance as $value) {
        // echo $value, PHP_EOL;  // uncomment to print each filename
        $total_files++;
    }
    echo "Mem peak usage: " . (memory_get_peak_usage(true) / 1024 / 1024) . " MiB", PHP_EOL;
    echo "Total number of files: " . $total_files, PHP_EOL;
    echo "Completed in: ", microtime(true) - $start, " seconds", PHP_EOL;
    
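    If you also want the limit/offset behaviour the question asks for, you can wrap the generator in SPL's LimitIterator, which skips a given number of entries and then stops after a given count. A minimal sketch reusing the recursiveDirectoryIterator() function above (the $offset and $limit values are just examples):

    // A fresh generator is built for each page; LimitIterator advances past
    // $offset entries and yields at most $limit of them. Memory stays flat,
    // but skipping still has to read the skipped entries one by one.
    $offset = 200; // example: start at the 201st file
    $limit  = 100; // example page size

    $page = new LimitIterator(recursiveDirectoryIterator('../vendor'), $offset, $limit);
    foreach ($page as $filename) {
        echo $filename, PHP_EOL;
    }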

    Here's what I got when I ran the benchmark above on my not-so-great laptop.

    (Screenshot of the script's output: peak memory usage, total file count, and elapsed time.)
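    If the plan is to fetch the names from the browser in AJAX requests, as the question suggests, the same idea works without saving any directory pointer between calls: each request simply builds a fresh generator and skips to its page. A hedged sketch of such an endpoint, where the file name files.php, the offset/limit query parameters and the hard-coded picture path are all placeholders:

    <?php
    // files.php -- hypothetical endpoint: GET /files.php?offset=0&limit=100
    // Assumes the recursiveDirectoryIterator() generator from above is
    // defined (or required) here. No state is kept between requests.

    $path   = '/path/to/pictures'; // placeholder directory
    $offset = max(0, (int) ($_GET['offset'] ?? 0));
    $limit  = min(1000, max(1, (int) ($_GET['limit'] ?? 100)));

    $files = [];
    foreach (new LimitIterator(recursiveDirectoryIterator($path), $offset, $limit) as $name) {
        $files[] = $name;
    }

    header('Content-Type: application/json');
    echo json_encode(['offset' => $offset, 'count' => count($files), 'files' => $files]);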