php · laravel · heroku · amazon-s3

Multiple image upload to s3 using laravel on heroku malfunction


I am trying to loop over a multiple file input element

<label class="btn btn-success btn-md rounded-1" for="galleryFile">Create Album</label>
<input type="file" name="galleryFile[]" id="galleryFile" class="d-none" multiple accept="image/*">

and save all selected images to AWS s3

public function sendToCloud($file, $folder, $prefix) {
  $filePath = Auth::user()->userid . '/' . $folder . '/' . $prefix;

  $extension = $file->getClientOriginalExtension();
  $filename  = $filePath . time() . '.' . $extension;

  Storage::disk('s3')->put($filename, fopen($file, 'r+'), 'public');
  return Storage::disk('s3')->url($filename);
}

$unikPath = uniqid();
$galleryUrl = '';

foreach ($request->file('galleryFile') as $key => $file) {
  if ($key == array_key_last($request->file('galleryFile'))) {
    $galleryUrl .= $this->sendToCloud($file, $unikPath . '/gallery', 'img_');
  } else {
    $galleryUrl .= $this->sendToCloud($file, $unikPath . '/gallery', 'img_') . ' | ';
  }
}

On my local machine it works fine, which made me believe it isn't a problem with my code or Laravel alone, but when I deployed to Heroku the upload behavior changed. What it does on Heroku is:

if I select 10 images to upload, it randomly picks and saves 2 of those images to s3 and returns the s3 URLs of those 2 images 10 times.

I would appreciate any help pointing me in the right direction on how to fix this issue.

Thanks in advance.


Solution

  • UPDATED ANSWER

    Heeding the advice from @Peter O, I used bin2hex(random_bytes(7)) to generate a unique name for each file. This prevented images from being overwritten by the next one, without the need for sleep(1).

    public function sendToCloud($file, $folder, $prefix) {
      $filePath = Auth::user()->userid . '/' . $folder . '/' . $prefix;
    
      $extension = $file->getClientOriginalExtension();
      $filename  = $filePath . bin2hex(random_bytes(7)) . '.' . $extension;
    
      Storage::disk('s3')->put($filename, fopen($file, 'r+'), 'public');
      return Storage::disk('s3')->url($filename);
    }
    

    PREVIOUS ANSWER

    So I was able to fix this. After multiple trials, I realized that all the images were actually being processed and sent to s3, but I could only see two because the uploads happened so quickly that later images overwrote earlier ones that had been given the same name.

    How?

    The PHP uniqid() function generates its ID from the current microtime, and the time() call used for the filename in sendToCloud only has one-second resolution, so several images processed at the same moment all receive the same value and therefore the same filename.
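
    As a quick illustration (a standalone sketch, not part of the original controller), naming files from time() shows why this happens: every call within the same second returns the same value, so each upload in that second reuses the same object key.

    $names = [];
    for ($i = 0; $i < 5; $i++) {
      // time() has one-second resolution, so all iterations of this fast loop
      // usually produce exactly the same "unique" filename
      $names[] = 'img_' . time() . '.jpg';
    }
    var_dump(array_unique($names)); // typically a single name for all five files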

    How I solved it

    I added a sleep() call inside the foreach loop to delay execution by 1 second, giving the program time to generate a new unique name.

    foreach ($request->file('galleryFile') as $key => $file) {
      if ($key == array_key_last($request->file('galleryFile'))) {
        $galleryUrl .= $this->sendToCloud($file, $unikPath . '/gallery', 'img_');
      } else {
        $galleryUrl .= $this->sendToCloud($file, $unikPath . '/gallery', 'img_') . ' | ';
        sleep(1); // wait one second so the next file gets a different timestamp-based name
      }
    }
    

    I hope this helps someone someday.