Tags: django, heroku, amazon-s3, cloudamqp

Celery worker not able to access files created by Django application on Heroku


My Django application is running on Heroku. Ultimately, I would like to store the original and processed image files on Amazon S3. My first version of the code stored all the files on S3 directly, but it occasionally hit Heroku's 30-second request timeout and returned an Application Error, meaning the round trip for the request took longer than 30 seconds.

I modified the code to upload and process the files on the dyno's local/ephemeral file system, and to use a Celery worker with CloudAMQP to transfer the files to S3 in the background. The application successfully uploads and processes the files, and other parts of the application read them, so I know they are being written. However, for some reason the worker cannot see the files. I am getting this error in the Heroku log: app[scheduler.1]: ... [Errno 2] No such file or directory: u'/app/media/images/....

Should the Celery worker be able to see the ephemeral file system? Might I be missing a configuration step on Heroku?

Thanks for any observations or comments you can provide.


Solution

  • I have found the answer to my question. Because the web application and the workers run on different Heroku dynos, each dyno has its own separate ephemeral file system. A file written to the web dyno's file system simply does not exist on the worker dyno, so the worker's attempt to open the path fails with `[Errno 2] No such file or directory`.
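
One way around this is to stop passing filesystem paths to the worker and instead send the file's contents through the task message itself. The sketch below illustrates the idea; the function name, the example filename, and the commented-out storage call are assumptions, not the poster's actual code, and a plain function stands in for a Celery `@shared_task` so the sketch is self-contained:

```python
import base64

# Hypothetical sketch: the worker runs on a different dyno with its own
# ephemeral file system, so it must receive the file *contents*, not a path.
# In a real project this would be decorated with Celery's @shared_task and
# invoked with .delay(); a plain function is used here for illustration.
def store_on_s3(filename, encoded_bytes):
    """Decode the payload and hand it to the storage backend.

    With django-storages configured for S3, the commented line below
    would perform the actual upload (an assumed setup, not shown in
    the original post).
    """
    data = base64.b64decode(encoded_bytes)
    # from django.core.files.base import ContentFile
    # from django.core.files.storage import default_storage
    # default_storage.save(filename, ContentFile(data))
    return len(data)  # number of bytes the worker actually received

# On the web dyno: read the uploaded file and enqueue the task with its
# contents. base64 keeps the payload safe for JSON task serialization.
payload = base64.b64encode(b"fake image bytes").decode("ascii")
# store_on_s3.delay("images/photo.jpg", payload)  # real Celery invocation
print(store_on_s3("images/photo.jpg", payload))
```

Note that message brokers are not designed for very large payloads; for big images, another common approach is to have the web request upload the original to S3 quickly and let the worker download it from S3, process it, and upload the result.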