
Create custom Argo artifact type


Whenever an S3 artifact is used, the following declaration is needed:

s3:
  endpoint: s3.amazonaws.com
  bucket: "{{workflow.parameters.input-s3-bucket}}"
  key: "{{workflow.parameters.input-s3-path}}/scripts/{{inputs.parameters.type}}.xml"
  accessKeySecret:
    name: s3-access-user-creds
    key: accessKeySecret
  secretKeySecret:
    name: s3-access-user-creds
    key: secretKeySecret

It would be helpful if this could be abstracted to something like:

custom-s3:
  bucket: "{{workflow.parameters.input-s3-bucket}}"
  key: "{{workflow.parameters.input-s3-path}}/scripts/{{inputs.parameters.type}}.xml"

Is there a way to make this kind of custom definition in Argo to reduce boilerplate?


Solution

  • For a given Argo installation, you can set a default artifact repository in the workflow controller's ConfigMap. Each artifact then only needs to specify its key (assuming everything else is set in the default config; any field missing from the default must still be specified per artifact).
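    As a rough sketch, the default repository lives under the artifactRepository key of the controller's ConfigMap. The bucket name below is an assumption; the secret names match the ones from the question:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  # Name and namespace must match your Argo installation.
  name: workflow-controller-configmap
data:
  artifactRepository: |
    s3:
      endpoint: s3.amazonaws.com
      bucket: my-default-bucket   # assumption: replace with your bucket
      accessKeySecret:
        name: s3-access-user-creds
        key: accessKeySecret
      secretKeySecret:
        name: s3-access-user-creds
        key: secretKeySecret
```

    With this default in place, each artifact in a Workflow can shrink to an s3 block containing only the key field.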

    Unfortunately, that will only work if you're only using one S3 config. If you need multiple configurations, cutting down on boilerplate will be more difficult.

    In response to your specific question: not exactly. You can't add a custom key (like custom-s3) as a member of the artifacts array. The exact format of the YAML is defined in Argo's Workflow Custom Resource Definition. If your Workflow YAML doesn't match that specification, it will be rejected.

    However, you can use external templating tools to populate boilerplate before the YAML is installed in your cluster. I've used Helm before to do exactly that with a collection of S3 configs. At the simplest, you could use something like sed.
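    As a minimal sketch of the sed approach, you could keep the boilerplate in a template file with placeholder tokens and substitute the per-workflow values before applying the YAML. The file paths and placeholder names (BUCKET_EXPR, KEY_EXPR) here are hypothetical:

```shell
# Write the boilerplate once, with placeholders where values vary.
# The quoted 'EOF' keeps the shell from expanding anything inside.
cat > /tmp/artifact-template.yaml <<'EOF'
s3:
  endpoint: s3.amazonaws.com
  bucket: "BUCKET_EXPR"
  key: "KEY_EXPR"
  accessKeySecret:
    name: s3-access-user-creds
    key: accessKeySecret
  secretKeySecret:
    name: s3-access-user-creds
    key: secretKeySecret
EOF

# Fill in the Argo template expressions; '|' is used as the sed
# delimiter because the key contains slashes.
sed -e 's|BUCKET_EXPR|{{workflow.parameters.input-s3-bucket}}|' \
    -e 's|KEY_EXPR|{{workflow.parameters.input-s3-path}}/scripts/{{inputs.parameters.type}}.xml|' \
    /tmp/artifact-template.yaml > /tmp/artifact.yaml
```

    A tool like Helm does the same thing more robustly (named templates, values files), which pays off once you have more than a couple of S3 configs.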

    tl;dr - for one S3 config, use default artifact config; for multiple S3 configs, use a templating tool.