I have a private S3 bucket with *Block all public access* enabled in its settings.
When I run collectstatic I get an access denied error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I also get a 403 when trying to access the static files (i.e. JS & CSS) that I've uploaded manually.
I've made a bucket policy based on this post:
{
    "Version": "2012-10-17",
    "Id": "Policy16144340382381",
    "Statement": [
        {
            "Sid": "Stmt16144340315031",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::634515378440:user/myapp-user"
            },
            "Action": [
                "s3:PutObject",
                "s3:AbortMultipartUpload",
                "s3:DeleteObject",
                "s3:GetObject",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::myapp-bucket",
                "arn:aws:s3:::myapp-bucket/*"
            ]
        },
        {
            "Sid": "Stmt1614408230409",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::634515378440:user/myapp-user"
            },
            "Action": "*",
            "Resource": "arn:aws:s3:::myapp-bucket/*"
        }
    ]
}
I added the IAM user to a group with this permission policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObjectAcl",
                "s3:GetObject",
                "s3:ListBucketMultipartUploads",
                "s3:AbortMultipartUpload",
                "s3:PutObjectVersionAcl",
                "s3:ListBucket",
                "s3:DeleteObject",
                "s3:GetBucketLocation",
                "s3:PutObjectAcl",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::myapp-bucket/*",
                "arn:aws:s3:::myapp-bucket"
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        }
    ]
}
Also, here is the code in settings.py that I've used to configure the app:
INSTALLED_APPS = [
    ...
    'storages',
    ...
]
# S3 BUCKETS CONFIG
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_S3_CUSTOM_DOMAIN = f'{AWS_STORAGE_BUCKET_NAME}.s3.amazonaws.com'
AWS_S3_OBJECT_PARAMETERS = {'CacheControl': 'max-age=86400'}
AWS_LOCATION = 'static'
AWS_DEFAULT_ACL = 'public-read'
AWS_S3_FILE_OVERWRITE = True
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
DEFAULT_FILE_STORAGE = 'myapp.storages.MediaStorage'
Here is storages.py:
import os

from storages.backends.s3boto3 import S3Boto3Storage


class MediaStorage(S3Boto3Storage):
    location = 'media'
    file_overwrite = True
    bucket_name = os.environ['AWS_STORAGE_BUCKET_NAME']
I've even tried making new users and buckets and adding them to the policy in case the credentials were wrong, but still no result. Is there a way I can verify that the connection is being made successfully and perhaps list the permissions the user has through the boto3 client?
UPDATE:
I configured an AWS CLI profile using myapp-user's credentials, and it lets me list the buckets (aws s3 ls --profile myapp-user). I was also able to run aws s3 cp to copy files between my local PC and the bucket just fine.
So I know the credentials are fine, but at this point I don't know what else I could do. I have made sure that the credentials I used in the AWS CLI config are identical to the ones in my env variables, but I still get the AccessDenied error.
As Jarmod pointed out in the comments, the reason I was getting AccessDenied was that I had set AWS_DEFAULT_ACL = 'public-read', which clashed with the bucket's Block Public Access settings. I was able to resolve it by setting AWS_DEFAULT_ACL = None instead, which simply makes uploaded objects inherit the bucket's access settings.
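For reference, a minimal sketch of the corrected fragment of settings.py (everything else unchanged):

```python
# settings.py -- corrected storage setting.
# With no default ACL, django-storages sends PutObject without an ACL
# parameter, so objects simply inherit the bucket's access settings and
# Block Public Access no longer rejects the upload.
AWS_DEFAULT_ACL = None
```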