Dask: Jobs On Multiple Nodes With One Worker, Run On One Node Only
I am trying to process some files with a Python function and would like to parallelize the task on a PBS cluster using Dask. On the cluster I can only launch one job, but I have access to multiple nodes within that job.
Solution 1:
from dask_jobqueue import PBSCluster

cluster = PBSCluster(cores=240,
                     memory="1GB")
The values you give to the Dask-Jobqueue constructors are per-job values, that is, the resources for a single job on a single node. So here you are asking for a single node with 240 cores, which probably doesn't exist on most systems today.
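If your scheduler did allow several jobs, a more typical setup would look something like the sketch below (assuming, for illustration, 24-core nodes and that you want ten workers; adjust the numbers to your actual hardware):

from dask_jobqueue import PBSCluster
from dask.distributed import Client

# Each job/worker gets one node's worth of resources (assumed 24 cores here)
cluster = PBSCluster(cores=24, memory="100GB")
cluster.scale(jobs=10)  # submit ten such jobs to PBS
client = Client(cluster)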
If you can only launch one job, then dask-jobqueue's model probably won't work for you. I recommend looking at dask-mpi as an alternative.
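A minimal dask-mpi script looks roughly like the sketch below; the number of workers comes from however many MPI ranks you launch inside your single job (for example, mpirun -np 24 python script.py):

from dask_mpi import initialize
from dask.distributed import Client

# Turn the MPI ranks of this one job into a Dask scheduler plus workers
initialize()

# Connects to the scheduler started by initialize()
client = Client()

# ...then submit work with client.submit / client.map as usual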