Python multiprocessing - return values from 3 different functions


There are a few different ways to solve this. The simplest one is to use a multiprocessing.Pool and the apply_async function:

from multiprocessing import Pool

def func1():
    x = 2
    return x

def func2():
    y = 1
    return y

def func3():
    z = 5
    return z

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        # Schedule all three functions on the pool without blocking
        r1 = pool.apply_async(func1, ())
        r2 = pool.apply_async(func2, ())
        r3 = pool.apply_async(func3, ())

        # .get() blocks until the corresponding result is ready
        print(r1.get(timeout=1))
        print(r2.get(timeout=1))
        print(r3.get(timeout=1))

The multiprocessing.Pool is a rather helpful construct that takes care of the underlying communication between processes, setting up pipes, queues and whatever else is needed. The most common use case is to feed different data to the same function (distributing the work) using the .map method. However, it can also run different functions, e.g. through .apply_async as I am doing here.
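
Since the .map pattern mentioned above is the more common one, here is a minimal sketch of it; the square function and the input list are illustrative assumptions, not part of the original question:

from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        # .map distributes the items of the iterable over the worker
        # processes and returns the results in the original order
        print(pool.map(square, [1, 2, 3, 4, 5]))  # [1, 4, 9, 16, 25]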

This, however, does not work from the interactive interpreter, since the worker processes need to be able to import the functions; the code must be stored in a .py file and run using python filename.py.

Author by Fab
Updated on June 05, 2022

Comments

  • Fab
    Fab almost 2 years

    I need to execute 3 functions in parallel and retrieve a value from each of them.

    Here is my code:

    from multiprocessing import Process

    def func1():
        ...
        return x

    def func2():
        ...
        return y

    def func3():
        ...
        return z

    p1 = Process(target=func1)
    first = p1.start()
    p2 = Process(target=func2)
    second = p2.start()
    p3 = Process(target=func3)
    third = p3.start()
    p1.join()
    p2.join()
    p3.join()
    

    but first, second and third seem to be 'NoneType' objects.

    What's wrong in my code?

    Thanks

  • Fab
    Fab about 6 years
    The functions seem not to run in parallel...
  • Fab
    Fab about 6 years
    Is there a way to add a 'barrier'? That is, the master process should continue only after the last worker process has ended.
  • JohanL
    JohanL about 6 years
    Well, yes, you can block indefinitely by not setting a timeout value in your .get() calls. Also, if you do set a timeout, the get call can fail with a TimeoutError, which really needs to be handled (even though I did not); see the sketch after these comments.
  • Reck
    Reck about 6 years
    This is a basic example of data parallelism using Pool. You may want to check the documentation.
  • JohanL
    JohanL about 6 years
    No, they do not run in parallel. As stated in the documentation for apply(func[, args[, kwds]]): "Equivalent of the apply() built-in function. It blocks until the result is ready, so apply_async() is better suited for performing work in parallel. Additionally, func is only executed in one of the workers of the pool."
  • Reck
    Reck about 6 years
    Hmm, interesting, makes sense! Then I feel @JohanL's solution is more suitable. And yeah, an async function call in multiprocessing is better than blocking until each function returns its result, which is what happens with Pool.apply.
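
Below is a minimal sketch of the two points raised in the comments: calling .get() without a timeout blocks until the worker finishes (the 'barrier' behaviour), while setting a timeout requires handling multiprocessing.TimeoutError. The slow function and the one-second timeout are illustrative assumptions:

import time
from multiprocessing import Pool, TimeoutError

def slow():
    time.sleep(5)
    return 42

if __name__ == '__main__':
    with Pool(processes=1) as pool:
        result = pool.apply_async(slow, ())
        try:
            # With a timeout, .get() raises TimeoutError if the worker
            # has not finished within the given number of seconds
            print(result.get(timeout=1))
        except TimeoutError:
            print('worker did not finish within 1 second')
        # Without a timeout, .get() blocks until the result is ready,
        # acting as a barrier for the parent process
        print(result.get())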