I’m trying to run a function with submitit on a SLURM cluster.
The function uses ProcessPoolExecutor:
from concurrent.futures import ProcessPoolExecutor

def func():
    print("Hello world!")

def parallel_executor():
    with ProcessPoolExecutor(max_workers=24) as executor_inner:
        j = executor_inner.submit(func)
        j.result()
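Run directly (no submitit involved), this pattern works fine. Here is the minimal self-contained version I used to check, returning the string instead of printing so the result is visible:

```python
from concurrent.futures import ProcessPoolExecutor

def func():
    # same hello-world task, but returning instead of printing for the check
    return "Hello world!"

def parallel_executor():
    # two workers are enough for the check; the real code uses 24
    with ProcessPoolExecutor(max_workers=2) as executor_inner:
        return executor_inner.submit(func).result()

if __name__ == "__main__":
    print(parallel_executor())  # prints: Hello world!
```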
I’m then submitting the function parallel_executor with submitit to the SLURM cluster:
import submitit

# AutoExecutor requires a folder for its logs and pickles (any path works)
executor = submitit.AutoExecutor(folder="submitit_logs")
j = executor.submit(parallel_executor)
But I get the following error:
Traceback (most recent call last):
  File "/home/X/venvs/smac_env/lib/python3.10/site-packages/submitit/core/submission.py", line 55, in process_job
    result = delayed.result()
  File "/home/X/venvs/smac_env/lib/python3.10/site-packages/submitit/core/utils.py", line 133, in result
    self._result = self.function(*self.args, **self.kwargs)
  File "/home/X/meta_hpo/test_submitit.py", line 33, in parallel_executor
    j.result()
  File "X/Python/3.10.8-GCCcore-12.2.0/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "X/Python/3.10.8-GCCcore-12.2.0/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "X/Python/3.10.8-GCCcore-12.2.0/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "X/Python/3.10.8-GCCcore-12.2.0/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function func at 0x151b9dbed6c0>: attribute lookup func on __main__ failed
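If it helps, I can reproduce the exact message without SLURM at all: pickle serializes functions by reference (module name plus qualified name) and verifies that lookup at dump time. My guess is that inside the submitit job, __main__ is submitit's own runner rather than my script, so func cannot be resolved. A minimal standard-library sketch that triggers the same error:

```python
import pickle

def func():
    print("Hello world!")

# pickle stores functions by reference and checks at dump time that the
# name still resolves in its module; removing the name makes that fail
ref = func
del func  # simulate func being absent from __main__ in the job process

try:
    pickle.dumps(ref)
except pickle.PicklingError as e:
    print(e)  # Can't pickle <function func at 0x...>: attribute lookup func on __main__ failed
```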
Is there a fix for this issue or another way to run a function that has multiple “sub-tasks”?