Tags: kubernetes, workflow, pipeline, jobs, data-science

Kubernetes: can analytical jobs be chained together in a workflow?


The Kubernetes "Run to Completion" documentation says that jobs can be run in parallel, but is it possible to chain together a series of jobs that should run in a specific order (parallel and/or non-parallel)?

https://kubernetes.io/docs/concepts/workloads/controllers/jobs-run-to-completion/


Or is it up to the user to keep track of which jobs have finished and trigger the next job using a PubSub messaging service?


Solution

  • Overall, no. Check out tools like Airflow for this. Job objects give you a fairly simple way to run a container until it completes, and that's about it. The parallelism only means you can run multiple copies of the same pod; it's not a full workflow management system :)
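That said, a simple sequential chain can be scripted from outside the cluster with `kubectl wait`, which blocks until a Job reaches the `Complete` condition. This is a minimal sketch, not a substitute for a workflow engine; the Job names (`job-a`, `job-b`) and manifest files are assumptions for illustration:

```shell
#!/usr/bin/env sh
# Sketch: run two Jobs strictly in sequence against a live cluster.
# Assumes job-a.yaml and job-b.yaml define Jobs named job-a and job-b.
set -e

kubectl apply -f job-a.yaml
# Block until job-a completes (fails the script on timeout).
kubectl wait --for=condition=complete job/job-a --timeout=600s

# Only launch the second stage after the first has finished.
kubectl apply -f job-b.yaml
kubectl wait --for=condition=complete job/job-b --timeout=600s
```

Note that `kubectl wait` exits nonzero if the Job instead hits the `Failed` condition's timeout path, so with `set -e` the chain stops on failure; for anything beyond a couple of stages (retries, fan-out, backfills), a scheduler like Airflow is the better fit.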