Tags: go, kubernetes, argo-workflows, argoproj

What's the best way to inject a YAML file into an Argo workflow step?


Summary:

We have a golang application that submits Argo workflows to a Kubernetes cluster on request. I'd like to pass a YAML file to one of the steps, and I'm wondering what the options are for doing this.

Environment:

  • Argo: v2.4.2
  • K8s: 1.13.12-gke.25

Additional details:

Eventually, I would like to pass this file to the test step as shown in this example:

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: test-
spec:
  entrypoint: test
  templates:
  - name: test
    # source is only valid on script templates, so use script rather than container
    script:
      image: gcr.io/testproj/test:latest
      command: [bash]
      source: |
        python test.py --config_file_path=/path/to/config.yaml

The image used in this step contains a Python script that receives the path to this file and then reads it.

To submit the Argo workflows with golang, we use the following dependencies:

Thank you.


Solution

  • Option 1: pass the file as a parameter

    Workflow parameters are usually small bits of text or numbers. But if your YAML file is reasonably small, you could string-encode it and pass it as a parameter (see the Go sketch after the example below).

    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    metadata:
      generateName: test-
    spec:
      entrypoint: test
      arguments:
        parameters:
        - name: yaml
          value: "string-encoded yaml"
      templates:
      - name: test
        inputs:
          parameters:
          - name: yaml
        script:
          image: gcr.io/testproj/test:latest
          command: [bash]
          source: |
            # In this case, the string-encoding should be bash-compatible.
            python test.py --config_file_as_string="{{inputs.parameters.yaml}}"
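
    To make the string-encoding step concrete, here is a minimal sketch of how the golang app could prepare the parameter value before submitting the workflow. This is my own illustration rather than anything from the question: the local file name config.yaml is a placeholder, and base64 is just one convenient encoding that sidesteps bash-quoting issues.

    package main

    import (
        "encoding/base64"
        "fmt"
        "io/ioutil"
    )

    func main() {
        // Read the local config file (placeholder path).
        raw, err := ioutil.ReadFile("config.yaml")
        if err != nil {
            panic(err)
        }

        // Base64-encode the contents so the value survives interpolation
        // into the bash source block without quoting problems.
        encoded := base64.StdEncoding.EncodeToString(raw)

        // This string would be set as spec.arguments.parameters[0].value
        // on the Workflow object the app submits.
        fmt.Println(encoded)
    }

    The step would then decode the parameter back into a file before calling the script, e.g. echo "{{inputs.parameters.yaml}}" | base64 -d > /path/to/config.yaml.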
    

  • Option 2: pass the file as an artifact

    Argo supports multiple types of artifacts. Perhaps the simplest for your use case is the raw artifact type.

    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    metadata:
      generateName: test-
    spec:
      entrypoint: test
      templates:
      - name: test
        inputs:
          artifacts:
          - name: yaml
            path: /path/to/config.yaml
            raw:
              data: |
                this is
                the raw file
                contents
        script:
          image: gcr.io/testproj/test:latest
          command: [bash]
          source: |
            python test.py --config_file_path=/path/to/config.yaml
    

    Besides raw, Argo supports "S3, Artifactory, HTTP, [and] Git" artifacts (among others, I think).

    If, for example, you chose to use S3, you could upload the file from your golang app and then pass the S3 bucket and key as parameters (see the sketch below).
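
    As a rough illustration of that approach, here is a sketch using the AWS SDK for Go (aws-sdk-go) and its s3manager uploader. The bucket and key are placeholders and error handling is minimal; treat it as a starting point rather than a finished implementation.

    package main

    import (
        "log"
        "os"

        "github.com/aws/aws-sdk-go/aws"
        "github.com/aws/aws-sdk-go/aws/session"
        "github.com/aws/aws-sdk-go/service/s3/s3manager"
    )

    func main() {
        f, err := os.Open("config.yaml") // placeholder local path
        if err != nil {
            log.Fatal(err)
        }
        defer f.Close()

        sess := session.Must(session.NewSession())
        uploader := s3manager.NewUploader(sess)

        // Upload the config; the app would then pass these bucket/key
        // values to the workflow as parameters.
        _, err = uploader.Upload(&s3manager.UploadInput{
            Bucket: aws.String("my-workflow-configs"), // placeholder bucket
            Key:    aws.String("configs/config.yaml"), // placeholder key
            Body:   f,
        })
        if err != nil {
            log.Fatal(err)
        }
    }

    The step could then fetch the object via Argo's S3 artifact support, with the bucket and key supplied through those parameters.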

Golang client

I'm not familiar with the golang client, but passing parameters is certainly supported, and I think passing a raw artifact should be supported as well (a sketch follows).
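
For completeness, here is an untested sketch of how the golang app might build and submit the Option 2 workflow with the argoproj client packages. The package paths and type names reflect my best understanding of the Argo v2.4.x codebase (e.g. wfv1.RawArtifact and the generated versioned clientset); verify them against the version you vendor.

package main

import (
    wfv1 "github.com/argoproj/argo/pkg/apis/workflow/v1alpha1"
    wfclientset "github.com/argoproj/argo/pkg/client/clientset/versioned"
    corev1 "k8s.io/api/core/v1"
    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    "k8s.io/client-go/tools/clientcmd"
)

func main() {
    // rawYaml holds the config file contents, read however the app prefers.
    rawYaml := "this is\nthe raw file\ncontents\n"

    wf := wfv1.Workflow{
        ObjectMeta: metav1.ObjectMeta{GenerateName: "test-"},
        Spec: wfv1.WorkflowSpec{
            Entrypoint: "test",
            Templates: []wfv1.Template{{
                Name: "test",
                Inputs: wfv1.Inputs{
                    Artifacts: []wfv1.Artifact{{
                        Name: "yaml",
                        Path: "/path/to/config.yaml",
                        // A raw artifact is materialized at Path inside the step's container.
                        ArtifactLocation: wfv1.ArtifactLocation{
                            Raw: &wfv1.RawArtifact{Data: rawYaml},
                        },
                    }},
                },
                // For brevity this runs the Python entrypoint directly rather
                // than through a bash script template.
                Container: &corev1.Container{
                    Image:   "gcr.io/testproj/test:latest",
                    Command: []string{"python", "test.py", "--config_file_path=/path/to/config.yaml"},
                },
            }},
        },
    }

    // Build a client from the local kubeconfig and submit the workflow.
    restConfig, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
    if err != nil {
        panic(err)
    }
    client := wfclientset.NewForConfigOrDie(restConfig)
    if _, err := client.ArgoprojV1alpha1().Workflows("default").Create(&wf); err != nil {
        panic(err)
    }
}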