I have thousands of Python files to run PyType on, which would take months sequentially. So I am trying to use Python's multiprocessing to spawn multiple processes, each running PyType on a separate file, to speed things up.
I tried the following script, expecting N processes, each running PyType on a different file. However, I found that all processes run PyType on the same file instead.
I suspect the problem lies in PyType's internal working mechanism. Has anyone encountered a similar problem?
import multiprocessing
import subprocess
import os

def work(cmd):
    result = subprocess.run(cmd, capture_output=True, text=True)
    return f"{os.getpid()} - Command: {cmd}\nOutput: {result.stdout}\nError: {result.stderr}"

if __name__ == '__main__':
    my_dir = "xxx"
    cmds = []
    for file in os.listdir(my_dir):
        if file.endswith(".py"):
            cmds.append(["pytype", "-o", os.path.join(my_dir, '.pytype'),
                         os.path.join(my_dir, file)])

    count = multiprocessing.cpu_count()
    with multiprocessing.Pool(processes=count) as pool:
        results = pool.map(work, cmds)
    for result in results:
        print(result)