Is your feature request related to a problem? Please describe.
Our current Slurm setup is a combination of two bash scripts that can be difficult to understand and customize in other user environments, since they have assumptions baked in (enroot/pyxis for containers, a specific cluster setup, etc.).
There have been advancements in dask-jobqueue's Slurm runner that should make it easier to launch multi-node jobs in an environment similar to ours.
In theory, it should simplify launching multi-node Slurm jobs, with all the setup information expressed through the runner API.
It could be worth exploring whether this makes our multi-node Slurm setup a bit simpler.
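As a starting point for that exploration, here is a minimal sketch of what a multi-node launch could look like with dask-jobqueue's `SlurmRunner` (available in dask-jobqueue >= 0.9). It assumes the runner's documented pattern: every Slurm task runs the same script, and the runner uses `SLURM_PROCID` to decide which rank becomes the scheduler, which runs the client code, and which become workers. The `srun_command` helper, its flag set, and the script name `curator_job.py` are illustrative assumptions, not part of the `SlurmRunner` API.

```python
import os


def srun_command(num_nodes: int, script: str = "curator_job.py") -> str:
    """Build the srun line that launches one task per node, all running
    the same script. (Hypothetical helper for illustration; the exact
    flags depend on the cluster.)"""
    return f"srun --nodes={num_nodes} --ntasks-per-node=1 python {script}"


def main() -> None:
    # Imports are kept local so the sketch stays importable on machines
    # without dask-jobqueue installed.
    from dask.distributed import Client
    from dask_jobqueue.slurm import SlurmRunner

    # Every rank executes this same block; SlurmRunner coordinates roles
    # through the shared scheduler file ({job_id} is filled in by the runner).
    with SlurmRunner(scheduler_file="scheduler-{job_id}.json") as runner:
        with Client(runner) as client:
            client.wait_for_workers(runner.n_workers)
            # ... multi-node Dask work would go here ...


if __name__ == "__main__" and "SLURM_PROCID" in os.environ:
    main()
```

Compared with the two bash scripts, the appeal is that container/cluster assumptions would live in one Python entry point submitted via a single `srun` invocation.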
ayushdg changed the title from "Explore Dask jonque's slurm runner for multi node slurm setups." to "Explore Dask jobque's slurm runner for multi node slurm setups." on Aug 23, 2024.
Thanks @jacobtomlinson for the suggestion!