I have a list of S3 objects that I want to copy from one bucket to another. This is the scenario:
| Object in Bucket 1 | Object in Bucket 2 |
|---|---|
| s3://bucket1/path/to/object1 | s3://bucket2/different/path/{date}/{mode}/object1 |
| s3://bucket1/path/to/object2 | s3://bucket2/different/path/{date}/{mode}/object2 |
| s3://bucket1/path/to/object3 | s3://bucket2/different/path/{date}/{mode}/object3 |
I can't use the --recursive option because I'm completely changing the object keys. I'm executing one aws s3 cp per file, and it's pretty slow. Is there a better way to do it?
You can run those three commands in parallel on your computer by adding a & at the end of each one:
aws s3 cp s3://bucket1/path/to/object1 s3://bucket2/different/path/{date}/{mode}/object1 &
aws s3 cp s3://bucket1/path/to/object2 s3://bucket2/different/path/{date}/{mode}/object2 &
aws s3 cp s3://bucket1/path/to/object3 s3://bucket2/different/path/{date}/{mode}/object3 &
This runs the commands 'in the background', all at the same time rather than one at a time. The output on your screen will be interleaved and a bit confusing, but it will work. You can run wait afterwards to block until all the background jobs have finished.
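If the list of key pairs is long or generated programmatically, you can do the same thing from a script with boto3 and a thread pool instead of spawning one CLI process per object. This is a minimal sketch, assuming a mapping of (source bucket, source key, destination bucket, destination key) tuples; the function names `copy_all` and `s3_copy` are my own, not an AWS API. Note that `copy` between buckets is a server-side operation, so the objects are not downloaded to your machine.

```python
from concurrent.futures import ThreadPoolExecutor


def copy_all(pairs, copy_fn, max_workers=8):
    """Apply copy_fn to each (src_bucket, src_key, dst_bucket, dst_key)
    tuple concurrently, using up to max_workers threads.
    Returns the results in the same order as the input pairs."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda p: copy_fn(*p), pairs))


def s3_copy(src_bucket, src_key, dst_bucket, dst_key):
    """Server-side copy of one object. Requires boto3 installed
    and AWS credentials configured (e.g. via environment or ~/.aws)."""
    import boto3  # imported here so copy_all can be used/tested without it
    s3 = boto3.client("s3")
    s3.copy({"Bucket": src_bucket, "Key": src_key}, dst_bucket, dst_key)
    return dst_key
```

Usage would look like `copy_all(pairs, s3_copy, max_workers=8)`, where `pairs` is built from your table. Threads are a reasonable choice here because the work is network-bound, not CPU-bound.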