amazon-web-services, aws-batch

How to pass environment variables to an AWS Batch job definition


I want to define environment variables for an AWS Batch job I am running. Different ENV variables trigger certain parts of the code. When I add them in the Batch console, they are not passed through to the container properties of the job definition. I am passing the variables in the same form that I pass them into my Dockerfile: NAME=VALUE. Below is the output I get after both creating a new definition and creating a revision to an existing definition.

"containerProperties": { "image": "docker-image", "vcpus": 2, "memory": 2000, "command": [], "volumes": [], "environment": [], "mountPoints": [], "ulimits": [] }

Is there a special syntax that's used for this?


Solution

  • You need a Lambda function to start the job. The Python code looks like this:

    import boto3

    client = boto3.client('batch')

    def lambda_handler(event, context):
        # Submit a job to the queue using an existing job definition
        response = client.submit_job(
            jobName='some-job',
            jobQueue='some-Queue',
            jobDefinition='some-job-definition:1')  # :1 is the revision number
        return response
    

    The environment variables themselves can then be supplied from the Lambda function's environment when the job is submitted, rather than being set in the console; see the sketch below.
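
    If the goal is to make those variables available inside the container, one way is to pass them through the containerOverrides parameter of submit_job. The following is a minimal sketch, not part of the original answer: the job, queue, and definition names are placeholders, and STAGE is a hypothetical variable used only for illustration.

    import os
    import boto3

    client = boto3.client('batch')

    def lambda_handler(event, context):
        # Forward a variable from the Lambda function's environment
        # into the job's container as a name/value pair.
        response = client.submit_job(
            jobName='some-job',
            jobQueue='some-Queue',
            jobDefinition='some-job-definition:1',
            containerOverrides={
                'environment': [
                    {'name': 'STAGE', 'value': os.environ.get('STAGE', 'dev')},
                ]
            })
        return response

    The same name/value structure is what the environment list under containerProperties expects if the job definition is registered with register_job_definition instead of through the console.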