How to kill all Pool workers in multiprocessing?


Solution 1

This isn't working the way you're intending because calling sys.exit() in a worker process only terminates that worker. It has no effect on the parent process or the other workers, because they're separate processes and raising SystemExit only affects the current process. You need to send a signal back to the parent process to tell it that it should shut down. One way to do this for your use case is to use an Event created by a multiprocessing.Manager server:

import multiprocessing

def myfunction(i, event):
    if not event.is_set():  # skip the work once shutdown has been requested
        print(i)
    if i == 20:
        event.set()  # tell the parent it's time to shut the pool down

if __name__ == "__main__":
    p = multiprocessing.Pool(10)
    m = multiprocessing.Manager()
    event = m.Event()  # a Manager Event can be passed safely to pool workers
    for i in range(100):
        p.apply_async(myfunction, (i, event))
    p.close()

    event.wait()   # We'll block here until a worker calls `event.set()`
    p.terminate()  # Terminate all processes in the Pool

Output:

0
1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20

As pointed out in Solution 2, there is a race here: there's no guarantee that all the workers will run in order, so it's possible that myfunction(20, ...) will run before myfunction(19, ...), for example. It's also possible that tasks with i greater than 20 will run before the main process can act on the event being set. I reduced the size of the race window by adding the if not event.is_set(): check prior to printing i, but it still exists.
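
A sketch that narrows the window a little further (not part of the original answer): have the parent also check the event while submitting, so it stops queueing new tasks as soon as a worker signals.

import multiprocessing

def myfunction(i, event):
    if not event.is_set():  # skip the work once shutdown has been requested
        print(i)
    if i == 20:
        event.set()

if __name__ == "__main__":
    p = multiprocessing.Pool(10)
    m = multiprocessing.Manager()
    event = m.Event()
    for i in range(100):
        if event.is_set():  # a worker already signaled; stop queueing new tasks
            break
        p.apply_async(myfunction, (i, event))
    p.close()

    event.wait()
    p.terminate()

This only shrinks the window; tasks that were already queued before the event was set can still run, so the output is still not guaranteed to stop exactly at 20.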

Solution 2

You can't do this.

Even if you were able to end all of your processes when i == 20, you couldn't be sure that only the numbers 0 through 20 were printed, because the tasks execute in a non-deterministic order.

If you only want the first 20 tasks to run, then you need to manage this from your master process (i.e. your control loop), as sketched below.
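
For example, here is a minimal sketch (not from the original answer) of that control-loop approach: the parent consumes results in order with Pool.imap and stops as soon as it sees the value it cares about, so the printing and the decision to stop happen in one place.

import multiprocessing

def myfunction(i):
    # The worker only computes and returns; the parent decides when to stop.
    return i

if __name__ == "__main__":
    with multiprocessing.Pool(10) as p:
        # imap yields results in submission order, so the parent can stop
        # deterministically after handling i == 20.
        for result in p.imap(myfunction, range(100)):
            print(result)
            if result == 20:
                break
    # Exiting the `with` block calls p.terminate(), stopping any remaining workers.

This prints exactly the numbers 0 through 20, although some workers may still have started tasks beyond 20 before the pool is terminated.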

Comments

  • N3TC4T
    N3TC4T over 3 years

    I want to stop all of the pool's workers from inside a single worker.

    I have a multiprocessing pool with 10 workers:

    import multiprocessing
    import sys

    def myfunction(i):
        print(i)
        if i == 20:
            sys.exit()

    p = multiprocessing.Pool(10, init_worker)

    for i in range(100):
        p.apply_async(myfunction, (i,))
    

    My program does not stop, and the other processes continue working until all 100 iterations are complete. I want to stop the pool entirely from inside the worker that calls sys.exit(); the way it is currently written, only the worker that calls sys.exit() stops.
