I wanted to install an Argo WorkflowTemplate and a CronWorkflow as a Helm chart. The helm install command says the chart is installed, but I see that only the WorkflowTemplate got deployed; the CronWorkflow didn't.
Folder structure:

Argo_helm
|- templates
|  |- azure-migration-cron-etl.yaml
|  |- azure-migration-etl-template.yaml
|- Chart.yaml
|- values.yaml
When I run the helm install command, I see the following:
helm upgrade --install argo-helm-charts --values values.yaml .
WARNING: Kubernetes configuration file is group-readable. This is insecure. Location: /Users/e192270/.kube/config
WARNING: Kubernetes configuration file is world-readable. This is insecure. Location: /Users/e192270/.kube/config
Release "argo-helm-charts" does not exist. Installing it now.
NAME: argo-helm-charts
LAST DEPLOYED: Mon Feb 8 11:28:01 2021
NAMESPACE: argo
STATUS: deployed
REVISION: 1
TEST SUITE: None
When I list templates, I can see it:
$ argo template list
NAME
test-azure-migration
But I am unable to see the CronWorkflow installed as part of the chart:
argo cron list -n argo
NAME AGE LAST RUN NEXT RUN SCHEDULE SUSPENDED
CronWorkflow code (azure-migration-cron-etl.yaml):
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: {{ .Values.workflow_name }}
  namespace: {{ .Values.workflow_namespace }}
  labels:
    workflows.argoproj.io/controller-instanceid: fp
spec:
  schedule: "0 1 * * *"
  timezone: "America/Chicago"
  workflowSpec:
    entrypoint: sparkdatabricks-app
    metrics:
      prometheus:
        - name: dim_bud_etl_duration
          labels:
            - key: team
              value: foundational-projects-data-eng
          help: "Duration of dim_bud etl"
          gauge:
            value: "{{`{{workflow.duration}}`}}"
    serviceAccountName: argo-s3-service-account
    volumes:
      - name: bearer-token
        secret:
          secretName: databricks-bearer-token-ophie
    templates:
      - name: sparkdatabricks-app
        steps:
          - - name: create-databricks-cluster
              templateRef:
                name: azure-migration-template
                template: databricks-api
              arguments:
                parameters:
                  - name: databricks-api-command
                    ........
        .......
WorkflowTemplate code (azure-migration-etl-template.yaml):
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: {{ .Values.workflow_template_name }}
  namespace: argo
spec:
  entrypoint: databricks-api
  serviceAccountName: fp-argo-workflow
  arguments:
    parameters:
      - name: databricks-api-command
        value: ""
  templates:
    - name: databricks-api
      inputs:
        parameters:
          - name: databricks-api-command
      container:
        image: basic-ubuntu:1.0.0
        imagePullPolicy: Always
        command: [sh, -c]
        args: [{{`"{{inputs.parameters.databricks-api-command}}"`}}]
        env:
          - name: BEARER
            valueFrom:
              secretKeyRef:
                name: databricks-bearer-token-ophie
                key: bearer
          - name: ACCOUNT
            valueFrom:
              secretKeyRef:
                name: databricks-bearer-token-ophie
                key: account
        volumeMounts:
          - name: bearer-token
            mountPath: "/secret/mountpath"
      outputs:
        parameters:
          - name: id # can be any id, depending on which Databricks API call it is
            valueFrom:
              path: /tmp/info.txt
values.yaml:
workflow_name: v-dim-bud-test
workflow_namespace: argo
workflow_template_name: test-azure-migration
Argo allows you to scale horizontally by adding multiple workflow controllers. Each controller gets an "instance ID." Your CronWorkflow specifies the fp workflow controller instance:
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: {{ .Values.workflow_name }}
  namespace: {{ .Values.workflow_namespace }}
  labels:
    workflows.argoproj.io/controller-instanceid: fp
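A resource labeled this way is only managed by a controller started with a matching --instanceid flag; Helm still creates the object either way. For context, here is a hedged sketch of where that flag lives on the controller side (the Deployment layout below is an assumption for illustration, not taken from your cluster):

```yaml
# Hypothetical excerpt from the workflow-controller Deployment spec.
# A controller started with "--instanceid fp" only manages resources
# labeled workflows.argoproj.io/controller-instanceid: fp.
containers:
  - name: workflow-controller
    args:
      - --configmap
      - workflow-controller-configmap
      - --instanceid
      - fp
```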
To list CronWorkflows managed by the fp workflow controller instance, use the --instanceid parameter:
$ argo cron list -n argo --instanceid fp
NAME             AGE   LAST RUN   NEXT RUN   SCHEDULE    SUSPENDED
v-dim-bud-test   3m    N/A        12h        0 1 * * *   false
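Independently of the argo CLI's instance-ID filtering, you can confirm that Helm really created the resource by querying the custom resource directly with kubectl (a sketch; assumes your kubeconfig points at the same cluster and namespace as the helm install):

```shell
# CronWorkflows are plain Kubernetes custom resources, so kubectl
# lists them without any controller instance-ID filtering.
kubectl get cronworkflows -n argo
```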