
django python cumulus - How to deal with uploading a large number of files to cloud file storage


I have a number of files processed and saved in a temp folder on my server, and I now want to move them into my default_storage location (default_storage is set to Rackspace Cloud Files using django-cumulus).

The process begins uploading the files correctly but only manages less than half of them before stopping. My guess is it's a memory issue, but I am not sure how to go about solving it. Here is the relevant code:

import os

from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

listing = os.listdir(os.path.join(path, 'images'))
listing.sort()

for infile in listing:
    # open in binary mode so the image bytes are read unmangled
    with open(os.path.join(path, 'images', infile), 'rb') as image:
        image_loc = default_storage.save(infile, ContentFile(image.read()))
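If memory really is the culprit, one thing worth trying is to hand the storage backend the open file object instead of the whole byte string, so it can read the upload in chunks. This is only a sketch and assumes the cumulus backend accepts a file-like object via Django's File wrapper:

import os

from django.core.files import File
from django.core.files.storage import default_storage

image_dir = os.path.join(path, 'images')  # 'path' as defined above

for infile in sorted(os.listdir(image_dir)):
    # pass a file object rather than file.read(), letting the backend
    # stream the upload instead of buffering each file fully in memory
    with open(os.path.join(image_dir, infile), 'rb') as image:
        image_loc = default_storage.save(infile, File(image))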

In case it makes a difference, my server is a Rackspace cloud instance running nginx and gunicorn on Ubuntu.


Solution

  • In the end the answer came in several parts. First, I had to add a TIMEOUT setting to cumulus (which is not mentioned in the django-cumulus documentation). Second, I increased the timeout for gunicorn. Finally, I increased the timeout parameter of nginx.
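For reference, a minimal sketch of the django-cumulus part of that change; the CUMULUS dict follows django-cumulus's settings convention, and the credential values and the timeout value are placeholders to adjust for your own setup:

# settings.py -- django-cumulus configuration (values are illustrative)
CUMULUS = {
    'USERNAME': 'your-rackspace-username',
    'API_KEY': 'your-rackspace-api-key',
    'CONTAINER': 'your-container-name',
    'TIMEOUT': 300,  # per-request timeout in seconds (the undocumented setting)
}

The gunicorn timeout can be raised with its --timeout option (or a timeout entry in a gunicorn config file), and the nginx side with the proxy_read_timeout / proxy_send_timeout directives in the block that proxies to gunicorn.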