
"An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied" when using batch jobs


  • I have a compute environment with 'ecsInstanceRole'. It contains the policies 'AmazonS3FullAccess' and 'AmazonEC2ContainerServiceforEC2Role'.
  • Since I am using the AmazonS3FullAccess policy, I assume the batch job has permission to list, copy, put, etc.
  • The image I am using is a custom docker image that has a startup script which uses "aws s3 ls <S3_bucket_URL>" (a minimal sketch follows this list).
  • When I start this image on an EC2 instance, it runs fine and lists the contents of the bucket.
  • When I do the same as a batch job, I get the access denied error seen above.
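
For reference, the startup script is little more than a call to the AWS CLI; a minimal sketch (the bucket URL is a placeholder) looks like this:

#!/bin/bash
# entrypoint.sh - runs when the container starts
set -euo pipefail

# Placeholder bucket URL; the real bucket name is different
BUCKET_URL="s3://my-bucket"

# List the bucket contents using whatever credentials the role chain provides
aws s3 ls "${BUCKET_URL}"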

I don't understand how this is happening.

Things I have tried so far:

  • Setting the following bucket policy (a CLI sketch for applying it follows this list):

{
    "Version": "2012-10-17",
    "Id": "Policy1546414123454",
    "Statement": [
        {
            "Sid": "Stmt1546414471931",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<Account Id>:root"
            },
            "Action": [
                "s3:ListBucket",
                "s3:ListBucketVersions"
            ],
            "Resource": [
                "arn:aws:s3:::"bucketname",
                "arn:aws:s3:::bucketname/*"
            ]
        }
    ]
}
  • Granting public access to the bucket
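
For completeness, the policy above can be applied with the S3 API, assuming it is saved locally as policy.json and the bucket name is substituted:

# Apply the bucket policy from a local JSON file (bucket name is a placeholder)
aws s3api put-bucket-policy \
    --bucket bucketname \
    --policy file://policy.json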

Solution

  • Quoting the reply from @JohnRotenstein because I cannot mark it as an answer.

    "If you are using IAM Roles, there is no need for a Bucket Policy. (Also, there is a small typo in that policy, before bucketname but I presume that was due to a Copy & Paste error.) It would appear that a role has not been assigned to your ECS task: IAM Roles for Tasks - Amazon Elastic Container Service"

    Solution: I had to attach an S3 access policy to my current Job Role (sketch below).
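
A minimal sketch of what that looks like with the AWS CLI, assuming the job role is named my-batch-job-role and the managed AmazonS3ReadOnlyAccess policy is enough for your use case (both names are placeholders):

# Attach an S3 access policy to the Batch job role
aws iam attach-role-policy \
    --role-name my-batch-job-role \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Reference that role in the job definition so the container assumes it at runtime
aws batch register-job-definition \
    --job-definition-name my-job-def \
    --type container \
    --container-properties '{
        "image": "<account-id>.dkr.ecr.<region>.amazonaws.com/my-image:latest",
        "vcpus": 1,
        "memory": 1024,
        "jobRoleArn": "arn:aws:iam::<account-id>:role/my-batch-job-role"
    }'

The key detail is jobRoleArn: the Batch container gets its S3 permissions from the job role, not from the ecsInstanceRole attached to the underlying instance.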