
Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me). StackOverflow source: http://stackoverflow.com/questions/33933561/

Date: 2020-08-19 14:13:02 | Source: igfitidea

Wait for all multiprocessing jobs to finish before continuing

Tags: python, parallel-processing, multiprocessing

Asked by Hybrid

I want to run a bunch of jobs in parallel and then continue once all the jobs are finished. I've got something like


# based on example code from https://pymotw.com/2/multiprocessing/basics.html
import multiprocessing
import random
import time

def worker(num):
    """A job that runs for a random amount of time between 5 and 10 seconds."""
    time.sleep(random.randrange(5,11))
    print('Worker:' + str(num) + ' finished')
    return

if __name__ == '__main__':
    jobs = []
    for i in range(5):
        p = multiprocessing.Process(target=worker, args=(i,))
        jobs.append(p)
        p.start()

    # Iterate through the list of jobs and remove the ones that are finished, checking every second.
    while len(jobs) > 0:
        jobs = [job for job in jobs if job.is_alive()]
        time.sleep(1)

    print('*** All jobs finished ***')

It works, but I'm sure there must be a better way to wait for all the jobs to finish than iterating over them again and again until they are done.


Accepted answer by jayant

What about:


for job in jobs:
    job.join()

This blocks until the first process finishes, then the next one, and so on. See the documentation on join() for more.


Answered by Rbtnk

You can make use of join(). It lets you wait for another process to end.


from multiprocessing import Process

# f and x are assumed to be defined elsewhere
t1 = Process(target=f, args=(x,))
t2 = Process(target=f, args=('bob',))

t1.start()
t2.start()

t1.join()
t2.join()

You can also use a Barrier. It works the same way as for threads, letting you specify the number of processes you want to wait on; once that number is reached, the barrier releases them all. Here, client and server are assumed to be spawned as Processes.


from multiprocessing import Barrier

b = Barrier(2, timeout=5)

def server():
    start_server()                 # placeholder: set up the server
    b.wait()                       # block here until the client is ready too
    while True:
        connection = accept_connection()
        process_server_connection(connection)

def client():
    b.wait()                       # block here until the server is ready too
    while True:
        connection = make_connection()
        process_client_connection(connection)

And if you want more functionality, such as sharing data or more flow control, you can use a Manager.
