I am writing down the scope of our CI/CD pipeline to be developed using AWS native tools. What do you recommend? I am in the process of finalizing the scope of our CI/CD pipeline, which will use AWS-native tools such as CodePipeline and CodeBuild. The basic pipeline boilerplate is written in CDK and we love that choice so far. Now we would like to define the final scope for it, and here is what we have so far.
I would also love to know what tools and capabilities are integrated into your own CI/CD pipelines, to make sure we end up with an enterprise-grade pipeline.
Pipeline per branch
Build once, deploy many
Cross-account deployment, i.e. deployment to different environments (dev/QA/prod) from a tools account
Pipeline behavior based on the branch name
Test execution based on the stage/environment
Integrate static code analysis
Manual approval by multiple people before deployment
Identifying security vulnerabilities in the application source code from the pipeline (maybe through Snyk)
Running security checks on AWS CloudFormation templates (maybe through Security Hub)
Allow developers to deploy feature branches in the common sandbox account from the CI/CD
Create dashboards for builds/deployments by sending events from the pipeline to CloudWatch
Watch alarms so that an automatic rollback happens when a test fails
Watch alarms so that an automatic rollback happens when a Config rule becomes non-compliant
Dynamic pipelines per branch based on events
Support a preview deployment stage
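To make the branch-driven items above concrete, here is roughly how we picture mapping branch names to pipeline behavior. This is only a minimal Python sketch; the branch prefixes, environment names, and dict keys are hypothetical placeholders, not a final design:

```python
# Sketch: deriving pipeline behavior from the branch name.
# All names below (branch prefixes, environments, keys) are
# hypothetical examples of the kind of mapping we have in mind.

def pipeline_config(branch: str) -> dict:
    """Return which stages a pipeline for this branch should run."""
    if branch in ("main", "master"):
        # Trunk: full promotion path with manual approval gates.
        return {"environments": ["dev", "qa", "prod"],
                "approval_required": True,
                "run_full_test_suite": True}
    if branch.startswith("release/"):
        # Release candidates: QA only, still gated by approval.
        return {"environments": ["qa"],
                "approval_required": True,
                "run_full_test_suite": True}
    if branch.startswith("feature/"):
        # Feature branches deploy only to the shared sandbox account.
        return {"environments": ["sandbox"],
                "approval_required": False,
                "run_full_test_suite": False}
    # Unrecognized branches get no deployment stages at all.
    return {"environments": [],
            "approval_required": False,
            "run_full_test_suite": False}
```

For example, `pipeline_config("feature/login")` would deploy only to the sandbox account with no approval gate, while `pipeline_config("main")` walks the full dev/QA/prod path.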
I would love to hear what can be improved/added in the current scope
That is a very complete scope, good work.
I would add an AMI build stage that builds app-specific AMIs using Packer. See
https://github.com/awslabs/ami-builder-packer for a good reference architecture.
I would also consider dynamic operations dashboards that generate an updated CloudWatch dashboard based on the key resources used in the project.
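As a sketch of what "generating" such a dashboard could mean: the dashboard body is just JSON that the pipeline can rebuild from a resource list on every deploy and publish via CloudWatch's PutDashboard API. The Lambda function names and the metrics chosen below are hypothetical:

```python
import json

# Sketch: build a CloudWatch dashboard body from a list of key
# resources discovered in the project, so the dashboard can be
# republished on every deploy. The function names and metrics
# here are hypothetical examples.

def dashboard_body(function_names, region="us-east-1"):
    """Return a CloudWatch dashboard body (JSON string) with one
    invocations/errors widget per Lambda function."""
    widgets = []
    for i, fn in enumerate(function_names):
        widgets.append({
            "type": "metric",
            "x": 0, "y": i * 6, "width": 12, "height": 6,
            "properties": {
                "title": f"{fn} invocations/errors",
                "metrics": [
                    ["AWS/Lambda", "Invocations", "FunctionName", fn],
                    ["AWS/Lambda", "Errors", "FunctionName", fn],
                ],
                "period": 300,
                "stat": "Sum",
                "region": region,
            },
        })
    return json.dumps({"widgets": widgets})
```

The returned string is what you would pass as the dashboard body, so adding a new function to the project automatically adds a widget on the next deploy.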
Consider something like semantic or conventional commit syntax, which would launch dynamic build activities based on human/machine-readable tags added to commit messages.
Semantic commits are commit messages with human- and machine-readable meaning that follow particular conventions.
For example, if you push a commit message containing the string build/preview, the build pipeline would launch a preview deployment on demand. It could take the PR number and make a dynamic URL for the app that might persist until the branch is merged. See https://nitayneeman.com/posts/understanding-semantic-commit-messages-using-git-and-angular/ for some ideas there.
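A sketch of what acting on such a tag could look like inside a build step. The build/preview tag and the preview URL scheme are hypothetical examples, not an established convention:

```python
import re
from typing import Optional

# Sketch: react to a machine-readable tag in the commit message.
# The "build/preview" tag and the pr-<N>.<domain> URL scheme are
# hypothetical examples of the idea described above.

PREVIEW_TAG = re.compile(r"\bbuild/preview\b")

def preview_url(commit_message: str, pr_number: int,
                domain: str = "preview.example.com") -> Optional[str]:
    """Return a per-PR preview URL if the commit requests one,
    otherwise None (no preview deployment is launched)."""
    if PREVIEW_TAG.search(commit_message):
        return f"https://pr-{pr_number}.{domain}"
    return None
```

A build step could call this on the head commit message and, when it returns a URL, trigger the on-demand preview deployment and post the URL back to the PR.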
I didn't see them called out, but unit tests, functional tests, and API tests should be included in the dynamic application software tests.
Load tests and vulnerability tests can be performed on completed deployments, to ensure each build conforms to established performance or security standards.
Also consider building your full infrastructure from code in the pipeline, if you are using Terraform or CloudFormation. Knowing that you can build everything from scratch is a great baseline. Using AWS Organizations you can even create new AWS accounts from scratch and build your entire infrastructure in a new account.
Docker image security scanning is another important pipeline stage related to container security. Images can be scanned against CVE and other vulnerability lists. See https://docs.docker.com/engine/scan/
I like to add a documentation/report publishing stage that takes project assets and integrates them into online documentation systems. For example, you can use Antora/AsciiDoctor/Netlify to build a documentation toolchain that generates HTML, PDF, and DOCX files for all project documentation, directly from the project repo, at build time. See https://fedoramagazine.org/using-antora-for-your-open-source-documentation/