Is it possible to run a function in a subprocess without threading or writing a separate file/script?


Solution 1

I think you're looking for something more like the multiprocessing module:

http://docs.python.org/library/multiprocessing.html#the-process-class

The subprocess module is for spawning processes and doing things with their input/output - not for running functions.

Here is a multiprocessing version of your code:

from multiprocessing import Process, Queue

# must be a global function    
def my_function(q, x):
    q.put(x + 100)

if __name__ == '__main__':
    queue = Queue()
    p = Process(target=my_function, args=(queue, 1))
    p.start()
    result = queue.get() # read before joining, or a large payload can deadlock
    p.join() # this blocks until the process terminates
    print(result)

Solution 2

You can use the standard Unix fork system call via os.fork(). fork() creates a new process running the same script. In the new (child) process it returns 0, while in the old (parent) process it returns the child's process ID.

import os

child_pid = os.fork()
if child_pid == 0:
    print("New proc")
else:
    print("Old proc")
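Getting a value back from the forked child takes a little more plumbing. One common pattern is a pipe (Unix-only, like fork itself); a minimal sketch:

```python
import os

read_fd, write_fd = os.pipe()
pid = os.fork()
if pid == 0:
    # Child: compute, write the result into the pipe, and exit.
    os.close(read_fd)
    os.write(write_fd, str(1 + 100).encode())
    os._exit(0)
else:
    # Parent: wait for the child, then read its result.
    os.close(write_fd)
    os.waitpid(pid, 0)
    with os.fdopen(read_fd) as pipe:
        result = pipe.read()
    print(result)  # 101
```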

For a higher-level library that provides a portable abstraction for using multiple processes, there's the multiprocessing module. There's an article on IBM DeveloperWorks, Multiprocessing with Python, with a brief introduction to both techniques.

Author: wroscoe
Updated on January 08, 2021

Comments

  • wroscoe
    wroscoe over 3 years
    import subprocess
    
    def my_function(x):
        return x + 100
    
    output = subprocess.Popen(my_function, 1) # I would like to pass the function object and its arguments
    print output
    #desired output: 101
    

    I have only found documentation on opening subprocesses using separate scripts. Does anyone know how to pass function objects or even an easy way to pass function code?

  • Brian Campbell
    Brian Campbell over 14 years
    I'm curious; why the downvote? Is there anything wrong in my answer?
  • Devin Jeanpierre
    Devin Jeanpierre over 14 years
    Multiprocessing is not just a higher level wrapper around fork(), it's a multiplatform multiprocessing toolkit (which uses fork on unix). Which is important, because this means it runs on, say, Windows, while fork() does not. Edit: And this was the reason for the downvote, although I later decided it probably wasn't worth it. Too late to take it back, though. Edit2: Or rather, fork() being suggested when it's not cross-platform was the reason.
  • Alex Martelli
    Alex Martelli over 14 years
    @Devin, you can always take back a downvote you did, if you want to.
  • Brian Campbell
    Brian Campbell over 14 years
    Edited to clarify that, then. I explicitly mentioned that fork is not portable; I generally will give non-portable answers along with information that they are non-portable, and let the questioner decide if that's sufficient for them. As I've edited my answer, you should be able to remove the downvote if you feel that I've improved it sufficiently; though no hard feelings if you don't, I just wanted to check to see what I'd gotten wrong.
  • Devin Jeanpierre
    Devin Jeanpierre over 14 years
    @Alex, nope, you can't. After a certain amount of time passes, you can't take it back, until an edit occurs. Such an amount of time had passed before I rethought, thus the "too late" comment. Anyway, as I said, I had decided it wasn't worth it, so it's gone. I also do appreciate and understand your reasons, and I'm glad there'd be no hard feelings either way. :p
  • schlamar
    schlamar about 12 years
    You can use the processify decorator as a shortcut: gist.github.com/2311116
  • Jens
    Jens about 9 years
    I assume that this clones the Python interpreter and all of its environment for the subprocess?
  • Stuart Axon
    Stuart Axon about 8 years
    Here is a fork of processify that works in python 3 and supports generator functions. gist.github.com/stuaxo/889db016e51264581b50
  • Petr Baudis
    Petr Baudis about 7 years
    Note that this code contains a deadlock in case you are passing non-trivially large data through the queue - always queue.get() before joining the process, otherwise it'll hang on trying to write to the queue while nothing is reading it.
  • Amir
    Amir about 6 years
    @schlamar I want to run a function in the background but I have some resource limitations and cannot run the function as many times that I want and want to queue the extra executions of the function. Do you have any idea on how I should do that? I have my question here. Could you please take a look at my question? Any help would be great!
  • Amir
    Amir about 6 years
    @BrianCampbell I want to run a function in the background but I have some resource limitations and cannot run the function as many times that I want and want to queue the extra executions of the function. Do you have any idea on how I should do that? I have my question here. Could you please take a look at my question and see if you can give me some hints (or even better, an answer) on how I should do that?