change conda-envs logic for dag
akhil-elucidata committed Jun 14, 2024
1 parent f49d299 commit 6da05ab
Showing 1 changed file with 17 additions and 0 deletions.
17 changes: 17 additions & 0 deletions snakemake/dag.py
@@ -326,6 +326,23 @@ def update_conda_envs(self):
not in self.workflow.storage_settings.shared_fs_usage
)
)
# The DAG's jobs may include jobs whose rules are not part of
# the DAG's rules. Such jobs never actually execute, even
# though they are part of the DAG, so do NOT consider them
# for conda-env deployment.
# This matters for cluster/cloud execution, where Snakemake
# uses the target-job and allowed-rules settings to submit an
# individual job to the cluster: only the target job's conda
# env is needed and should be created on the machine where
# the job runs.
# Example: target job '3' may need inputs from jobs '1' and
# '2', so the DAG's jobs contain all three. However, if
# allowed-rules is set to ['r3'], corresponding to job '3',
# then only job '3' actually executes, and only its conda env
# must be considered, even though the DAG's jobs are
# ['1', '2', '3'].
and job.rule in self.rules
}

# Then based on md5sum values
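The filtering idea behind this change can be sketched in isolation. The snippet below is a minimal illustration, not Snakemake's actual code: `Job`, `envs_to_deploy`, and the job/rule names are hypothetical stand-ins for the real DAG machinery, mirroring the example in the comment (jobs '1', '2', '3' with only rule `r3` allowed).

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Job:
    """Hypothetical stand-in for a Snakemake job."""
    name: str
    rule: str
    conda_env: str


def envs_to_deploy(dag_jobs, dag_rules):
    """Collect conda envs only for jobs whose rule is actually
    part of the DAG's rules, i.e. jobs that will really execute."""
    return {job.conda_env for job in dag_jobs if job.rule in dag_rules}


jobs = [
    Job("1", "r1", "env_a"),
    Job("2", "r2", "env_b"),
    Job("3", "r3", "env_c"),
]

# All three jobs are in the DAG, but with allowed-rules restricted
# to ['r3'], only job '3' runs, so only its env is deployed.
print(envs_to_deploy(jobs, {"r3"}))  # {'env_c'}
```

Without the rule filter, the comprehension would also deploy `env_a` and `env_b` on the cluster node even though jobs '1' and '2' never run there.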
