I am using Dagster to run local Node.js microservice pipelines in order to execute tests.
The idea is to spin up n Docker containers and n Node.js microservices, as easily as Dagster allows.
The problem is that when one of the tasks runs a shell command to start a Docker container, Dagster blocks at that point and does not execute the other tasks at the same level.
The current DAG looks like this:
login_aws
    |
    v
[docker_elastic, docker_kafka, sleep_10]
    |
    v
[node_service_one, node_service_two, node_service_three]
Can I run all the docker_* tasks at the same time, and likewise all the node_service tasks?
Is there another easily configurable option for building local DAGs?
Thanks
If you're using the new job/op APIs, Dagster uses a multiprocess executor by default, which can run multiple ops in parallel.
If you're using the pipeline/solid APIs, you can pass run configuration telling Dagster to use the multiprocess executor instead of the default in-process executor. If you're launching a pipeline from Dagit, the run config would look like:
execution:
  multiprocess: {}
If you're launching these runs from the Python APIs, the run config would be:
run_config={"execution": {"multiprocess": {}}}
Note that you'll need a multiprocess-compatible IOManager, such as the fs_io_manager (from dagster import fs_io_manager), since outputs must be persisted somewhere that other processes can read them.
Full docs on multiprocess execution are here: https://docs.dagster.io/0.12.14/concepts/solids-pipelines/pipeline-execution#multiprocessing-execution