I have a setup where I am using CodeCommit as my repository to store Lambda functions, and CodePipeline with AWS SAM to build and deploy them.
I would like to deploy the Lambda functions into different environments such as QA, staging, and prod. I have used AWS Systems Manager Parameter Store to reference my variables.
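For reference, the values are stored in Parameter Store roughly like this (the parameter names match the defaults in the template below; the values shown are just placeholders):

# values below are placeholders
aws ssm put-parameter --name MyBucketname --type String --value my-staging-bucket
aws ssm put-parameter --name MyCSVPath --type String --value data/source.csv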
Below is the template.yaml file I have set up. It creates a Lambda function and uses Parameter Store to reference the variables:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Test
Parameters:
  BucketName:
    Description: 'Required. Bucket Name'
    Type: 'AWS::SSM::Parameter::Value<String>'
    Default: 'MyBucketname'
  CSVPath:
    Description: 'Required. Configkey Name'
    Type: 'AWS::SSM::Parameter::Value<String>'
    Default: 'MyCSVPath'
Resources:
  GetOrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: ./LambdaCode
      Handler: app.lambda_handler
      FunctionName: app
      Runtime: python3.6
      Description: 'staging'
      Environment:
        Variables:
          BucketName: !Ref BucketName
          CSVPath: !Ref CSVPath
      Events:
        HelloWorld:
          Type: Api
          Properties:
            Path: /orders
            Method: get
I am able to define variables in my template.yaml for deployment, but I am not sure how to define them for different environments (prod or QA).
When the pipeline triggers, it should deploy to the QA environment using QA variables and to prod using prod variables, both defined in AWS Parameter Store.
What changes should I make in my template.yaml file to enable deploying to different environments?
As Meir has mentioned, you can use the Parameters and Mappings functionality in CloudFormation to do that. For example, add a Parameters section as follows:
Parameters:
  Stage:
    Type: String
    Default: staging
    Description: Parameter for getting the deployment stage
Then add a Mappings section with a map that holds the environment variables for all your stages:
Mappings:
  StagesMap:
    staging:
      CONFIG_BUCKET: staging-bucket-name
      CONFIG_KEY: source-data-key-path
    prod:
      CONFIG_BUCKET: prod-bucket-name
      CONFIG_KEY: source-data-key-path
Then your function can look up the variables based on the environment you are deploying to:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: CD Demo Lambda
Resources:
  CDDemoLambda:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: lambda_function.lambda_handler
      Runtime: python3.6
      CodeUri: ./LambdaCode
      FunctionName: ApigatewayLambda
      AutoPublishAlias: ApiLambda
      Description: 'Lambda function validation'
      MemorySize: 128
      Timeout: 30
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /getazs
            Method: get
      Environment:
        Variables:
          CONFIG_BUCKET: !FindInMap
            - StagesMap
            - Ref: Stage
            - CONFIG_BUCKET
          CONFIG_KEY: !FindInMap
            - StagesMap
            - Ref: Stage
            - CONFIG_KEY
Now, when you call the sam deploy command, you need to specify which stage you are deploying to, e.g.:
sam deploy --parameter-overrides Stage=prod
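If you keep a separate stack per environment, you can reuse the same template and change only the override and the stack name (the stack names below are just examples):

sam build
# QA / staging stack (example name)
sam deploy --stack-name orders-app-qa --parameter-overrides Stage=staging --capabilities CAPABILITY_IAM
# production stack (example name)
sam deploy --stack-name orders-app-prod --parameter-overrides Stage=prod --capabilities CAPABILITY_IAM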
Your complete CloudFormation template should look like this:
AWSTemplateFormatVersion: '2010-09-09'
Transform: 'AWS::Serverless-2016-10-31'
Description: CD Demo Lambda
Parameters:
  Stage:
    Type: String
    Default: staging
    Description: Parameter for getting the deployment stage
Mappings:
  StagesMap:
    staging:
      CONFIG_BUCKET: staging-bucket-name
      CONFIG_KEY: source-data-key-path
    prod:
      CONFIG_BUCKET: prod-bucket-name
      CONFIG_KEY: source-data-key-path
Resources:
  CDDemoLambda:
    Type: 'AWS::Serverless::Function'
    Properties:
      Handler: lambda_function.lambda_handler
      Runtime: python3.6
      CodeUri: ./LambdaCode
      FunctionName: ApigatewayLambda
      AutoPublishAlias: ApiLambda
      Description: 'Lambda function validation'
      MemorySize: 128
      Timeout: 30
      Events:
        ApiEvent:
          Type: Api
          Properties:
            Path: /getazs
            Method: get
      Environment:
        Variables:
          CONFIG_BUCKET: !FindInMap
            - StagesMap
            - Ref: Stage
            - CONFIG_BUCKET
          CONFIG_KEY: !FindInMap
            - StagesMap
            - Ref: Stage
            - CONFIG_KEY
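Since your deployments run from CodePipeline, one way to pass the stage is to run SAM from a CodeBuild action per pipeline stage. This is only a sketch; it assumes STAGE and STACK_NAME are set as CodeBuild environment variables for each stage:

version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.8
  build:
    commands:
      - sam build
      # STAGE and STACK_NAME are assumed CodeBuild environment variables set per pipeline stage
      - sam deploy --stack-name $STACK_NAME --parameter-overrides Stage=$STAGE --capabilities CAPABILITY_IAM --resolve-s3 --no-confirm-changeset --no-fail-on-empty-changeset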