I have some number of jobs that have been submitted to a ProcessPoolExecutor, and I want to track the progress of each job with tqdm. However, when I run the code below, the progress bars constantly swap places. I believe this is because *handwavey asynchronicity explanation*, but I don't know how to get around it. I imagine I could theoretically pipeline the progress bars up to the parent process, but I'm not familiar enough with concurrent.futures or tqdm to do that. What's the pythonic answer?
MWE:
from tqdm import tqdm
import time
import concurrent
import random
from concurrent.futures import ProcessPoolExecutor
def main():
    pool = ProcessPoolExecutor()
    futures = [pool.submit(my_function, x) for x in range(10)]
    for future in concurrent.futures.as_completed(futures):
        print(future)

def my_function(fn_number):
    for i in tqdm(range(500), desc=f"{fn_number}_outer"):
        for j in tqdm(range(500), desc=f"{fn_number}_inner"):
            # Sometimes the function skips to the outer loop.
            # This is foiling using tqdm.update()
            if random.random() > 0.9:
                break
            time.sleep(0.1)
    return fn_number

if __name__ == '__main__':
    main()
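For reference, here's roughly what I picture the "pipeline it up to the parent" idea looking like — just a sketch under assumptions of mine (a fixed `STEPS` per job, one queue message per unit of work, and the names `worker` and `N_JOBS` are all my own invention), not something I've confirmed is the idiomatic route:

```python
# Sketch: workers push one message per unit of work onto a Manager queue,
# and the parent process owns every tqdm bar, so the bars keep fixed
# positions instead of swapping. STEPS and the message format are
# assumptions of this sketch, not anything from the tqdm API.
import concurrent.futures
from concurrent.futures import ProcessPoolExecutor
from multiprocessing import Manager
from tqdm import tqdm

N_JOBS, STEPS = 4, 50

def worker(fn_number, queue):
    for _ in range(STEPS):
        queue.put(fn_number)  # report one unit of progress to the parent
    return fn_number

def main():
    with Manager() as manager:
        queue = manager.Queue()
        # One bar per job, all drawn by the parent at a fixed position.
        bars = [tqdm(total=STEPS, desc=f"job {x}", position=x)
                for x in range(N_JOBS)]
        with ProcessPoolExecutor() as pool:
            futures = [pool.submit(worker, x, queue) for x in range(N_JOBS)]
            # Drain exactly N_JOBS * STEPS progress messages; each one
            # advances the bar belonging to the job that sent it.
            for _ in range(N_JOBS * STEPS):
                bars[queue.get()].update(1)
            results = [f.result()
                       for f in concurrent.futures.as_completed(futures)]
        for bar in bars:
            bar.close()
    return results

if __name__ == '__main__':
    print(main())
```

I'm guessing the fixed message count only works because my workers do a known amount of work; with the early `break` in my real code I'd presumably need a sentinel per finished job instead.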