django, django-storages, botocore, python-django-storages

django-storages EndpointConnectionError


Sorry for the noise, but I think I am missing something and I can't find a solution. When running collectstatic, I get the following error: botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "http://localhost:1212/test/static/gis/css/ol3.css"

Here is the setup:

docker-compose.yaml

. . .
  s3server:
    image: scality/s3server:latest
    restart: unless-stopped
    ports:
      - "1212:8000"
    volumes:
      - s3data:/usr/src/app/localData
      - s3metadata:/usr/src/app/localMetadata
    environment:
      SCALITY_ACCESS_KEY_ID: newAccessKey
      SCALITY_SECRET_ACCESS_KEY: newSecretKey
      SSL: "FALSE"

settings.py

# AWS settings
AWS_ACCESS_KEY_ID = env.str('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = env.str('AWS_SECRET_ACCESS_KEY')
AWS_S3_REGION_NAME = env.str('AWS_S3_REGION_NAME')
AWS_STORAGE_BUCKET_NAME = env.str('AWS_STORAGE_BUCKET_NAME')
AWS_S3_ENDPOINT_URL = env.str('AWS_S3_ENDPOINT_URL')
AWS_DEFAULT_ACL = None
AWS_S3_OBJECT_PARAMETERS = {
    'CacheControl': 'max-age=86400',
}
AWS_QUERYSTRING_AUTH = False

# s3 static settings
AWS_STATIC_LOCATION = 'static'
STATIC_URL = f'{AWS_S3_ENDPOINT_URL}/{AWS_STATIC_LOCATION}/'
STATICFILES_STORAGE = 'backend.storages.StaticStorage'

# s3 media settings
AWS_MEDIA_LOCATION = 'media'
MEDIA_URL = f'{AWS_S3_ENDPOINT_URL}/{AWS_MEDIA_LOCATION}/'
DEFAULT_FILE_STORAGE = 'backend.storages.PublicMediaStorage'

dev.env

AWS_STORAGE_BUCKET_NAME=test
AWS_ACCESS_KEY_ID=newAccessKey 
AWS_SECRET_ACCESS_KEY=newSecretKey
AWS_S3_REGION_NAME=us-east-1
AWS_S3_ENDPOINT_URL=http://localhost:1212

backend/storages.py

class StaticStorage(S3Boto3Storage):
    location = settings.AWS_STATIC_LOCATION
    default_acl = "public-read"


class PublicMediaStorage(S3Boto3Storage):
    location = settings.AWS_MEDIA_LOCATION
    default_acl = 'public-read'
    file_overwrite = False

I really don't understand why, as the following script works just fine:

script.py

import logging
import boto3
from botocore.exceptions import ClientError

s3_client = boto3.client(
    's3',
    aws_access_key_id="newAccessKey",
    aws_secret_access_key="newSecretKey",
    endpoint_url='http://localhost:1212',
    region_name="us-east-1",
)

def create_bucket(bucket_name):
    try:
        s3_client.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={'LocationConstraint': "us-east-1"},
        )
    except ClientError as e:
        logging.error(e)
        return False
    return True


if __name__ == "__main__":
    # create_bucket takes only the bucket name; the region is set on the client
    create_bucket("test")

    response = s3_client.list_buckets()

    # Output the bucket names
    print('Existing buckets:')
    for bucket in response['Buckets']:
        print(f'  {bucket["Name"]}')

    # upload_file returns None, so there is no response to capture
    s3_client.upload_file(
        "backend/tests/test_image.jpg",
        "test",
        "static/test_image",
    )
    s3_client.download_file('test', 'static/test_image', 'toto.jpg')
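Since the traceback reports a connection failure rather than an authentication error, a quick TCP reachability check from wherever Django actually runs can tell you whether the endpoint is reachable at all. A minimal sketch (the is_reachable helper is illustrative, not part of the original setup):

```python
import socket

def is_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run from the host, this checks the published port mapping.
print(is_reachable("localhost", 1212))
# Run from inside the Django container, the same call targets the
# container's own loopback interface, where nothing listens on 1212.
```

Running this inside the web container versus on the host makes the localhost mix-up visible immediately.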

Solution

  • Well, inside a container, locahost is obviously not the other services. Change AWS_S3_ENDPOINT_URL=http://localhost:1212 to AWS_S3_ENDPOINT_URL=http://s3server:8000 and expose the 8000 port from s3server in compose. The last step to make it work is to add "s3server": "us-east-1" in the config.json mounted in the scality server.