
Copy an Amazon S3 bucket folder to the local system


I have this link:

s3://some_path/200_files/*.gz

I have the corresponding ACCESS ID and SECRET KEY. How can I copy the complete folder (200_files), or all of the .gz files in it, to the local system? An Ubuntu CLI or Python based solution would work. I understand this isn't a fully fleshed-out question; answers in comments would work too. Thanks :)


Solution

  • To copy all objects in an S3 bucket to your local machine, use the aws s3 cp command with the --recursive option.

    See: http://bigdatums.net/2016/09/04/copy-all-files-in-s3-bucket-to-local-with-aws-cli/
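
    For the path in the question, a command along these lines should do it. The bucket path is taken from the question; the --exclude/--include pair restricts the recursive copy to the .gz files:

    ```shell
    # Copy only the .gz objects under the prefix into the current directory.
    # s3://some_path/200_files/ is the path from the question.
    aws s3 cp s3://some_path/200_files/ . --recursive --exclude "*" --include "*.gz"
    ```

    Since all 200 files in the folder are .gz anyway, plain `aws s3 cp s3://some_path/200_files/ . --recursive` should give the same result.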

    To set the credentials:

    mkdir ~/.aws
    touch ~/.aws/credentials
    

    ~/.aws/credentials (sample content)

    [default]
    aws_access_key_id=AKIAIOSFODNN7EXAMPLE
    aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
    

    More configuration options (named profiles, default region, output format) are covered in the AWS CLI configuration documentation, or run aws configure to set them interactively.
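
  • The question also asks for a Python option. Here is a minimal boto3 sketch under the same credentials setup (boto3 reads ~/.aws/credentials automatically); the bucket and prefix names below are placeholders taken from the question's path, not verified values:

    ```python
    import os


    def gz_keys(keys):
        """Keep only the S3 object keys that name .gz files."""
        return [k for k in keys if k.endswith(".gz")]


    def download_gz_files(bucket, prefix, dest="."):
        """Download every .gz object under `prefix` in `bucket` into `dest`."""
        import boto3  # credentials are picked up from ~/.aws/credentials

        s3 = boto3.client("s3")
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
            contents = page.get("Contents", [])
            for key in gz_keys(obj["Key"] for obj in contents):
                target = os.path.join(dest, os.path.basename(key))
                s3.download_file(bucket, key, target)


    if __name__ == "__main__":
        # Placeholder bucket/prefix -- substitute your real values.
        download_gz_files("some_path", "200_files/")
    ```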