A file landing in an S3 bucket triggers a CloudWatch Event (I am able to capture the URL and key via $.detail.xxxx).
How can I then pass these to a Step Function, and from the Step Function pass them to a Fargate task as environment variables? I am trying to use Terraform's "aws_cloudwatch_event_target", however I cannot find good examples of launching a Step Function and passing inputs to it.
Here is the full resource I have so far:
resource "aws_cloudwatch_event_target" "cw-target" {
arn = aws_sfn_state_machine.my-sfn.arn
rule = aws_cloudwatch_event_rule.cw-event-rule.name
role_arn = aws_iam_role.my-iam.arn
input_transformer {
input_paths = {
bucket = "$.detail.requestParameters.bucketName"
}
}
input_template = <<TEMPLATE
{
"containerOverrides": [
{
"environment": [
{ "name": "BUCKET", "value": <bucket> },
]
}
]
}
TEMPLATE
}
On the CloudWatch Event target in the console I can see:
{"bucket":"$.detail.requestParameters.bucketName"}
and
{
  "containerOverrides": [
    {
      "environment": [
        { "name": "BUCKET", "value": "<bucket>" }
      ]
    }
  ]
}
I just need to know how to fetch this information inside the Step Function and then send it as an environment variable when calling Fargate.
For using input transformers in AWS EventBridge, check this guide, which also has more examples of input paths and templates: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-transform-target-input.html
You can transform the payload of the event that is passed to the Step Function by using an input path (as you have already done) and an input template, in which you use the variables defined in your input paths to build a new payload. This new payload will be used as the input for the Step Function.
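For example, here is a minimal sketch of the same event target with a flatter template that is easier to reference downstream. It is only a sketch: the key path assumes your rule is fed by CloudTrail-style PutObject events (which is also where $.detail.requestParameters.bucketName comes from), and the resource names are the ones from your snippet.

resource "aws_cloudwatch_event_target" "cw-target" {
  arn      = aws_sfn_state_machine.my-sfn.arn
  rule     = aws_cloudwatch_event_rule.cw-event-rule.name
  role_arn = aws_iam_role.my-iam.arn

  input_transformer {
    # Pull the values you need out of the event record.
    input_paths = {
      bucket = "$.detail.requestParameters.bucketName"
      key    = "$.detail.requestParameters.key"
    }

    # Build the JSON document that becomes the execution input of the state machine.
    input_template = <<TEMPLATE
{
  "bucket": "<bucket>",
  "key": "<key>"
}
TEMPLATE
  }
}

With this, the execution input of the state machine is simply {"bucket": "...", "key": "..."}, and you can pick the values up with JsonPath ($.bucket, $.key) anywhere in the state machine definition.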
Edit: If you want to start a Fargate task with these environment variables, your best option is indeed to use the container environment overrides to specify new env variables for each task, as sketched below.
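Here is a sketch of what that could look like, assuming the flatter {"bucket", "key"} input from the transformer sketch above. The cluster, task definition, container name and subnet are placeholders for your own resources, and I am reusing your my-iam role only as a stand-in; whatever role the state machine uses needs ecs:RunTask and iam:PassRole permissions (plus the EventBridge managed-rule permissions when you use the .sync integration).

resource "aws_sfn_state_machine" "my-sfn" {
  name     = "s3-to-fargate"
  role_arn = aws_iam_role.my-iam.arn

  # "Value.$" tells Step Functions to resolve the value with JsonPath against
  # the execution input instead of treating it as a literal string.
  definition = <<DEFINITION
{
  "StartAt": "RunFargateTask",
  "States": {
    "RunFargateTask": {
      "Type": "Task",
      "Resource": "arn:aws:states:::ecs:runTask.sync",
      "Parameters": {
        "LaunchType": "FARGATE",
        "Cluster": "${aws_ecs_cluster.my-cluster.arn}",
        "TaskDefinition": "${aws_ecs_task_definition.my-task.arn}",
        "NetworkConfiguration": {
          "AwsvpcConfiguration": {
            "Subnets": ["subnet-00000000"],
            "AssignPublicIp": "ENABLED"
          }
        },
        "Overrides": {
          "ContainerOverrides": [
            {
              "Name": "my-container",
              "Environment": [
                { "Name": "BUCKET", "Value.$": "$.bucket" },
                { "Name": "KEY", "Value.$": "$.key" }
              ]
            }
          ]
        }
      },
      "End": true
    }
  }
}
DEFINITION
}

If you keep your original containerOverrides-shaped template instead, the bucket is still reachable from the execution input, just at a longer path such as $.containerOverrides[0].environment[0].value.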
Old edit:
If you want to start a Fargate task with these environment variables, you have two options:
2. Only use one task definition, created beforehand, and use env files in that task definition (see the ECS documentation on environment files). Basically, when the task is started it fetches a file from S3 and uses the values in that file as env vars. Your step function then only has to contain a step that uploads a file to S3, and then a step that starts a Fargate task using the existing task definition (a sketch of such a task definition follows below).
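A sketch of that second option, with a hypothetical my-task.env object in a placeholder config bucket; note that the task's execution role also needs at least s3:GetObject on that file.

resource "aws_ecs_task_definition" "my-task" {
  family                   = "my-task"
  requires_compatibilities = ["FARGATE"]
  network_mode             = "awsvpc"
  cpu                      = "256"
  memory                   = "512"
  execution_role_arn       = aws_iam_role.my-iam.arn

  container_definitions = jsonencode([
    {
      name  = "my-container"
      image = "my-image:latest"
      # At startup ECS downloads this .env file from S3 and exports its
      # KEY=value lines as environment variables inside the container.
      environmentFiles = [
        {
          value = "arn:aws:s3:::my-config-bucket/my-task.env"
          type  = "s3"
        }
      ]
    }
  ])
}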