I am trying to force Snakemake to run the jobs of a rule (there are many of them) sequentially to avoid memory conflicts.
rule run_eval_all:
    input:
        expand(config["out_model"] + "{iLogit}.rds", iLogit = MODELS)

rule eval_model:
    input:
        script = config["src_est"] + "evals/script.R",
        model = config["out_model"] + "{iLogit}.rds",
    output:
        "out/{iLogit}.rds"
    threads: 5
    resources:
        mem_mb = 100000
    shell:
        "{runR} {input.script} "
        "--out {output}"
I run the rule with snakemake --cores all --resources mem_mb=100000 run_eval_all, but I keep getting errors like this:
x86_64-conda-linux-gnu % snakemake --resources mem_mb=100000 run_eval_all
Traceback (most recent call last):
  File "/local/home/zhakaida/mambaforge/envs/r_snake/bin/snakemake", line 10, in <module>
    sys.exit(main())
  File "/local/home/zhakaida/mambaforge/envs/r_snake/lib/python3.9/site-packages/snakemake/__init__.py", line 2401, in main
    resources = parse_resources(args.resources)
  File "/local/home/zhakaida/mambaforge/envs/r_snake/lib/python3.9/site-packages/snakemake/resources.py", line 85, in parse_resources
    for res, val in resources_args.items():
AttributeError: 'list' object has no attribute 'items'
If I run snakemake --cores all run_eval_all, it works, but the jobs run in parallel (as expected), which sometimes causes memory overuse and crashes. How do I properly declare memory requirements to Snakemake?
The error is due to a known issue with parsing the --resources argument in Snakemake 6.5.1; see https://github.com/snakemake/snakemake/issues/1069. Upgrade to Snakemake 6.5.3 or later and check whether the problem persists.
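Once you are on a fixed version, the original approach should do what you want: because each eval_model job declares mem_mb = 100000 and the global pool passed on the command line is also 100000, the scheduler can run at most one such job at a time, which serializes the rule. A minimal sketch of the upgrade and invocation (the conda channels and version pin here are assumptions, not from the question):

    # Upgrade Snakemake first, e.g. via mamba (channels assumed: conda-forge, bioconda)
    mamba install -c conda-forge -c bioconda 'snakemake>=6.5.3'

    # With a 100000 MB pool and each eval_model job claiming 100000 MB,
    # at most one job runs at any moment:
    snakemake --cores all --resources mem_mb=100000 run_eval_all

Note that --resources only constrains scheduling; it does not enforce a memory limit at the OS level, so the per-rule mem_mb value should be a realistic upper bound on what one job actually consumes.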