Python multiprocessing - return values from 3 different functions
There are a few different ways to solve this. The simplest one is to use a multiprocessing.Pool and the apply_async function:
from multiprocessing import Pool

def func1():
    x = 2
    return x

def func2():
    y = 1
    return y

def func3():
    z = 5
    return z

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        # Schedule the three functions on the pool without blocking.
        r1 = pool.apply_async(func1, ())
        r2 = pool.apply_async(func2, ())
        r3 = pool.apply_async(func3, ())
        # Collect the return values; .get() blocks until each result is ready.
        print(r1.get(timeout=1))
        print(r2.get(timeout=1))
        print(r3.get(timeout=1))
The multiprocessing.Pool is a rather helpful construct that takes care of the underlying communication between processes, setting up pipes, queues and whatever else is needed. The most common use case is to apply the same function to different pieces of data (distributing the work) using the .map method. However, it can also be used for different functions, e.g. via the .apply_async calls I am using here.
This, however, does not work from the interpreter; the code must be saved as a .py file and run using python filename.py.
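For comparison, here is a minimal sketch of the more common .map use case mentioned above, applying a hypothetical square function to a list of inputs:

from multiprocessing import Pool

def square(n):
    # Hypothetical worker function: the same computation for every input.
    return n * n

if __name__ == '__main__':
    with Pool(processes=3) as pool:
        # Distribute the same function over different data;
        # results come back in the same order as the inputs.
        results = pool.map(square, [1, 2, 3, 4, 5])
    print(results)  # [1, 4, 9, 16, 25]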
Fab
Updated on June 05, 2022

Comments
-
Fab almost 2 years
I need to execute 3 functions in parallel and retrieve a value from each of them.
Here my code:
def func1():
    ...
    return x

def func2():
    ...
    return y

def func3():
    ...
    return z

p1 = Process(target=func1)
first = p1.start()
p2 = Process(target=func2)
second = p2.start()
p3 = Process(target=func3)
third = p3.start()
p1.join()
p2.join()
p3.join()
but first, second and third seem to be 'NoneType' objects.
What's wrong with my code?
Thanks
-
Fab about 6 years
The functions do not seem to run in parallel...
-
Fab about 6 years
Is there a way to add a 'barrier'? That is, the master process should continue only after the last worker process has ended.
-
JohanL about 6 years
Well, yes, you can block indefinitely by not setting a timeout value in your .get() calls. Also, if you do set a timeout, the .get() call can fail with a TimeoutError, which really needs to be handled (even though I did not); a minimal handling sketch is appended after the comments.
-
Reck about 6 years
This is a basic example of data parallelism using Pool. You may want to check the documentation.
-
JohanL about 6 years
No, they do not run in parallel. As stated in the documentation: "apply(func[, args[, kwds]]): Equivalent of the apply() built-in function. It blocks until the result is ready, so apply_async() is better suited for performing work in parallel. Additionally, func is only executed in one of the workers of the pool."
-
Reck about 6 years
Hmm, interesting, makes sense! Then I feel @JohanL's solution is more suitable. And yes, an async function call in multiprocessing is better than blocking until each function returns its result, which is what happens with Pool.apply.
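Regarding the timeout discussion in the comments above, here is a minimal sketch, assuming a hypothetical slow worker function, of handling the TimeoutError that .get(timeout=...) can raise:

from multiprocessing import Pool, TimeoutError
import time

def slow():
    # Hypothetical function that takes longer than the timeout below.
    time.sleep(5)
    return 42

if __name__ == '__main__':
    with Pool(processes=1) as pool:
        result = pool.apply_async(slow, ())
        try:
            print(result.get(timeout=1))
        except TimeoutError:
            # The result was not ready within one second; handle it here.
            print('timed out waiting for the result')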