We deploy flat files to our web servers using Bamboo SCP jobs. I would like to move content from the web servers to S3, so I need a Bamboo job that deploys static content to an S3 bucket.
I assumed it would be a two-minute job to make a build plan that deploys flat files to S3, but I suspect I'm missing something obvious here, as I can't see how to do it.
First you need to add a "Script" task to your build job.
Then export your AWS credentials in the build script:
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
(Those are AWS's documented example credentials; substitute your own, and avoid hardcoding real keys in a script that gets checked in.)
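Rather than hardcoding the keys, you can store them as Bamboo plan variables and read them in the script. This is a sketch assuming hypothetical plan variables named `aws.accessKey` and `aws.secretKey`; Bamboo exposes plan variables to scripts as environment variables prefixed with `bamboo_`, with dots converted to underscores:

```shell
# Hypothetical Bamboo plan variables aws.accessKey / aws.secretKey become
# bamboo_aws_accessKey / bamboo_aws_secretKey in the script environment.
# The :- fallbacks are AWS's documented example credentials, used here only
# so the snippet also runs outside Bamboo.
export AWS_ACCESS_KEY_ID="${bamboo_aws_accessKey:-AKIAIOSFODNN7EXAMPLE}"
export AWS_SECRET_ACCESS_KEY="${bamboo_aws_secretKey:-wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY}"
```

Marking `aws.secretKey` as a password-type variable also keeps it masked in Bamboo's build logs.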
After that you can iterate over your files and copy each one to the desired bucket with the aws command:
FILES="backups/*"
bucket="s3://my-backups/database/"
for f in $FILES
do
  file=$(basename "$f")
  echo "Processing $file"
  target="$bucket$file"
  aws s3 cp "$f" "$target"
done
Alternatively, you can also copy a whole folder:
aws s3 cp "my-files/" "s3://my-backups/" --recursive
Or, if you want to be even faster, you can synchronize only the changed files:
aws s3 sync "my-files/" "s3://my-backups/"
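For a deploy job you may also want the bucket to mirror the build output exactly. A sketch using two real `aws s3 sync` flags, with a hypothetical bucket name; `--delete` removes objects from the bucket that no longer exist locally, and `--exclude` skips files you don't want uploaded:

```shell
# --delete  : remove remote objects that are gone from my-files/
# --exclude : skip temporary files matching the pattern
aws s3 sync "my-files/" "s3://my-backups/" --delete --exclude "*.tmp"
```

Adding `--dryrun` first is a cheap way to preview what the sync would do before letting Bamboo run it for real.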