
Django use private S3 storage only in production environment


I have set up my Django REST API to use local storage in DEBUG mode and S3 storage in production. This works well for public files, because I override the DEFAULT_FILE_STORAGE like so:

if not IS_DEBUG:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PublicMediaStorage'

and every FileField uses it automatically. Now I want to use private S3 storage the same way, but because I have to define the storage explicitly (FileField(storage=PrivateMediaStorage())), the S3 storage is always used.

How can I use the local storage instead of S3 storage when in DEBUG mode?

PS: I have already thought about changing the model to either use a FileField with or without an explicit storage depending on the DEBUG mode. This did not fully solve my problem, because my migrations are created in DEBUG mode and thus always contain the model without the private storage class.

UPDATE: I am looking for a solution that can share the same migrations between both environments and only lazily instantiates the actual storage class at runtime, just as Django already handles DEFAULT_FILE_STORAGE.


Solution

  • It sounds like the tricky part here is having both public and private media storage in a single project.

    The example below assumes you are using django-storages, but the technique should work regardless.

    Define a private storage by extending the S3BotoStorage class.

    If using S3, it is probably prudent to store private and public files in separate S3 buckets. This custom storage class lets you specify the private bucket via settings.

    # yourapp/custom_storage.py

    from django.conf import settings
    from django.core.files.storage import get_storage_class
    from storages.backends.s3boto import S3BotoStorage

    class S3PrivateStorage(S3BotoStorage):
        """S3 storage backend that saves objects with a private ACL."""
        default_acl = "private"               # this does the trick

        def __init__(self, *args, **kwargs):
            super(S3PrivateStorage, self).__init__(*args, **kwargs)
            self.bucket_name = settings.S3_PRIVATE_STORAGE_BUCKET_NAME


    # important: resolve the class named in settings, then instantiate it
    private_storage_class = get_storage_class(settings.PRIVATE_FILE_STORAGE)

    private_storage = private_storage_class()  # instantiate the storage
    

    The important part is the last two lines of this file: they declare private_storage for use in your FileField:

    from django.db import models

    from yourapp.custom_storage import private_storage
    ...
    class YourModel(models.Model):

        the_file = models.FileField(
                       upload_to=...,
                       storage=private_storage)
    ...
    

    Finally, in your settings file, something like this should do:

    # settings.py
    
    if DEBUG:
        # In debug mode, store everything on the local filesystem
        DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
        PRIVATE_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
    else:
        # In production, store public files using S3BotoStorage and private
        # files using the custom storage
        DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
        PRIVATE_FILE_STORAGE = 'yourapp.custom_storage.S3PrivateStorage'
    
    

    As a last piece of unsolicited advice: it is often useful to decouple the storage settings from DEBUG mode and let all of the parameters above be set through environment variables. At some point you will likely want to run your app in debug mode against a production-like storage configuration.