Run subprocess and print output to logging
Solution 1
You are right that there is a way to do it without creating a temporary file to store the process output.
You simply have to check the documentation of Popen, in particular about stdout and stderr:

stdin, stdout and stderr specify the executed program's standard input, standard output and standard error file handles, respectively. Valid values are PIPE, an existing file descriptor (a positive integer), an existing file object, and None. PIPE indicates that a new pipe to the child should be created. With the default settings of None, no redirection will occur; the child's file handles will be inherited from the parent. Additionally, stderr can be STDOUT, which indicates that the stderr data from the child process should be captured into the same file handle as for stdout.
So you can see that you can either use a file object or the PIPE value. This allows you to use the communicate() method to retrieve the output:
import subprocess
from io import BytesIO

process = subprocess.Popen(arguments, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, error = process.communicate()
log_subprocess_output(BytesIO(output))  # output is bytes in Python 3
I'd rewrite your code as:
import shlex
import logging
import subprocess
from io import BytesIO

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)
    logging.info('Subprocess: "' + command_line + '"')
    try:
        command_line_process = subprocess.Popen(
            command_line_args,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
        )
        process_output, _ = command_line_process.communicate()
        # process_output is now bytes, not a file,
        # you may want to do:
        # process_output = BytesIO(process_output)
        log_subprocess_output(process_output)
    except (OSError, subprocess.CalledProcessError) as exception:
        logging.info('Exception occurred: ' + str(exception))
        logging.info('Subprocess failed')
        return False
    else:
        # no exception was raised
        logging.info('Subprocess finished')
        return True
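Note that log_subprocess_output() is never defined in this answer; the name comes from the question's code. A minimal sketch (the body here is illustrative, not the asker's original) that accepts any file-like object or iterable of lines could be:

```python
import logging

def log_subprocess_output(pipe):
    """Log each line from a file-like object (text or bytes) at INFO level."""
    for line in pipe:
        if isinstance(line, bytes):
            # communicate() returns bytes; decode before logging
            line = line.decode(errors='replace')
        logging.info('Subprocess output: %s', line.rstrip())
```

It works both with the BytesIO wrapper shown above and with an open pipe.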
Solution 2
You could try to pass the pipe directly without buffering the whole subprocess output in memory:
from subprocess import Popen, PIPE, STDOUT
process = Popen(command_line_args, stdout=PIPE, stderr=STDOUT)
with process.stdout:
    log_subprocess_output(process.stdout)
exitcode = process.wait() # 0 means success
where log_subprocess_output() could look like:
def log_subprocess_output(pipe):
    for line in iter(pipe.readline, b''): # b'\n'-separated lines
        logging.info('got line from subprocess: %r', line)
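Putting the pieces of this solution together, a complete runnable sketch (assuming a POSIX system; echo stands in for your real command) looks like:

```python
import logging
from subprocess import Popen, PIPE, STDOUT

logging.basicConfig(level=logging.INFO)

def log_subprocess_output(pipe):
    for line in iter(pipe.readline, b''):  # b'\n'-separated lines
        logging.info('got line from subprocess: %r', line)

# 'echo' is only an illustrative command; substitute your own arguments
process = Popen(['echo', 'hello'], stdout=PIPE, stderr=STDOUT)
with process.stdout:
    log_subprocess_output(process.stdout)
exitcode = process.wait()  # 0 means success
```

Because the pipe is consumed line by line before wait(), the child's output is never buffered whole in memory.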
Solution 3
I was trying to achieve the same with check_call and check_output, and found the following solution to work.
import logging
import threading
import os
import subprocess

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)

class LogPipe(threading.Thread):
    def __init__(self, level):
        """Set up the object with a logger and a log level,
        and start the thread.
        """
        threading.Thread.__init__(self)
        self.daemon = False
        self.level = level
        self.fdRead, self.fdWrite = os.pipe()
        self.pipeReader = os.fdopen(self.fdRead)
        self.start()

    def fileno(self):
        """Return the write file descriptor of the pipe."""
        return self.fdWrite

    def run(self):
        """Run the thread, logging everything."""
        for line in iter(self.pipeReader.readline, ''):
            logging.log(self.level, line.strip('\n'))
        self.pipeReader.close()

    def close(self):
        """Close the write end of the pipe."""
        os.close(self.fdWrite)

    def write(self, message):
        """If your code has something like sys.stdout.write"""
        logging.log(self.level, message)

    def flush(self):
        """If your code has something like sys.stdout.flush"""
        pass
After implementing it, I performed the steps below:
try:
    # It works with multiple handlers as well
    logging.basicConfig(handlers=[logging.FileHandler(log_file), logging.StreamHandler()])

    sys.stdout = LogPipe(logging.INFO)
    sys.stderr = LogPipe(logging.ERROR)
    ...
    subprocess.check_call(subprocess_cmd, stdout=sys.stdout, stderr=sys.stderr)
    export_output = subprocess.check_output(subprocess_cmd, stderr=sys.stderr)
    ...
finally:
    sys.stdout.close()
    sys.stderr.close()
    # It is necessary to close the file handlers properly.
    sys.stdout = sys.__stdout__
    sys.stderr = sys.__stderr__
    logging.shutdown()
    os.remove(log_file)
Solution 4
This worked for me:
from subprocess import Popen, PIPE, STDOUT
command = "shell command with arguments"
process = Popen(command, shell=True, stdout=PIPE, stderr=STDOUT)
with process.stdout:
    for line in iter(process.stdout.readline, b''):
        print(line.decode("utf-8").strip())
With exception handling:
from subprocess import Popen, PIPE, STDOUT, CalledProcessError

command = "shell command with arguments"
process = Popen(command, shell=True, stdout=PIPE, stderr=STDOUT)
with process.stdout:
    try:
        for line in iter(process.stdout.readline, b''):
            print(line.decode("utf-8").strip())
    except CalledProcessError as e:
        print(f"{str(e)}")
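If the command does not actually need shell features, a safer variant (a sketch, not from the answer) avoids shell=True by splitting the command with shlex.split() and checks the exit status explicitly; on a POSIX system, sys.executable --version is used here only as a placeholder command:

```python
import shlex
import sys
from subprocess import Popen, PIPE, STDOUT

# placeholder command; substitute your own
command = f"{sys.executable} --version"
process = Popen(shlex.split(command), stdout=PIPE, stderr=STDOUT)
with process.stdout:
    for line in iter(process.stdout.readline, b''):
        print(line.decode("utf-8").strip())
exitcode = process.wait()
if exitcode != 0:
    print(f"command failed with exit code {exitcode}")
```

Avoiding the shell also sidesteps quoting and injection problems when the command contains user-supplied arguments.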
Kostya
Updated on October 08, 2021

Comments
-
Kostya over 2 years
I am looking for a way to call shell scripts from Python and write their stdout and stderr to a file using logging. Here is my code:
import logging
import tempfile
import shlex
import os

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)
    logging.info('Subprocess: \"' + command_line + '\"')
    process_succeeded = True
    try:
        process_output_filename = tempfile.mktemp(suffix = 'subprocess_tmp_file_')
        process_output = open(process_output_filename, 'w')
        command_line_process = subprocess.Popen(command_line_args,
                                                stdout = process_output,
                                                stderr = process_output)
        command_line_process.wait()
        process_output.close()
        process_output = open(process_output_filename, 'r')
        log_subprocess_output(process_output)
        process_output.close()
        os.remove(process_output_filename)
    except:
        exception = sys.exc_info()[1]
        logging.info('Exception occured: ' + str(exception))
        process_succeeded = False
    if process_succeeded:
        logging.info('Subprocess finished')
    else:
        logging.info('Subprocess failed')
    return process_succeeded
And I am sure that there is a way to do it without creating a temporary file to store the process output. Any ideas?
-
jfs over 10 years If you don't need to seek inside the file, the pipe could be used as-is to log the subprocess' output
-
0 _ almost 9 years if the output is expected to be large, then this can hang the subprocess: thraxil.org/users/anders/posts/2008/03/13/…
-
jfs almost 9 years @IoannisFilippidis: the code in the answer works for any output, no matter how large it is. The article that you've linked does not apply here.
-
knoight about 8 years @J.F.Sebastian - curious, is there a need to worry about p.wait()? Docs have a warning about Popen.wait which seems to relate to the issue @IoannisFilippidis brought up.
-
jfs about 8 years @knoight: it is ok to call it after I/O is done (note: p.stdout.close()). Again, the blog post does not apply to the code in the answer (it would apply if you moved (DO NOT DO IT!!!) p.wait() before the log_subprocess_output() call in this case).
-
moin moin over 7 years not sure why, but this code does not work. I have "cmd = 'ls'; p = Popen(... log.info(p.stdout)" but it still does not work. exitcode is always 2
-
jfs over 7 years @mark create a complete but minimal code example (show the exact code that you use). Mention the OS you use (Windows, macOS, etc) and the Python version (python -V). And post it as a new question.
-
moin moin over 7 years @J.F.Sebastian sure, I can post my OS version and other details... but did you consider that the code you provide in your answer might be inadequate? I do not see how opening a new question helps with that. Like, for example, where does p come from?
-
moin moin over 7 years for those who run into the same snafu, here is a working solution: stackoverflow.com/questions/18774476/…
-
jfs over 7 years @mark: you are right about the p name. It meant command_line_process, which is too long for a small code example; I've renamed both to process. ¶ Your code log.info(p.stdout) does not make sense (it was not from my answer); that is why I've suggested a new question with a complete code example (there could be other errors in your code). ¶ Your link accumulates the whole subprocess output in memory before logging anything. The code in my answer allows logging while the child process is still running, without exhausting all available memory.
-
moin moin over 7 years Now it is easier for me to follow, thank you. The memory aspect is very interesting to me. If I understand correctly, you propose a trick to pipe the errors onto stdout. Consequently, errors are logged as info, but it is more efficient in terms of memory. Is it possible to log errors as errors and be memory efficient, too?
-
jfs over 7 years @mark if you are asking about stderr=STDOUT, then it is one of the assumptions in the question (notice that both stdout and stderr are redirected to the same process_output file).
-
Cecil Curry about 7 years @J.F.Sebastian Actually, mark's query is of interest to me too. Arguably, it should be reposted as a proper question. But we're all lazy here. For the sake of argument, is logging stdout with logging level INFO and stderr with logging level ERROR in realtime (i.e., without buffering) even feasible? If so, my working assumption is that multiple non-trivial coroutines would be required. Which really sounds like a lot of work.
-
Cecil Curry about 7 years Ah-ha! You've already answered a similar question with a relevant multithreading solution. Since the logging module is thread-safe by design, refactoring that solution to log rather than queue process output should be fairly straightforward. Thanks for the intrepid insights. You are a wizard among men.
-
alpha_989 about 6 years What does the log_subprocess_output look like for your code sample?
-
Isaac almost 6 years That results in the entire stdout/stderr content residing in memory as strings, which may not be ideal for processes that write a lot to stdout/stderr.
-
DUDANF over 3 years Sir, you are an absolute genius. Thank you so much for this lovely piece of code. I wish I could upvote twice!
-
thanos.a over 2 years the terminal prints out b'my message'. How can I get rid of the b''?
-
jfs over 2 years @thanos.a: Convert the bytes to a string.
-
alexis_thual about 2 years @jfs is there a simple way to distinguish between stdout and stderr with this method? This would, for instance, allow logging stdout with logging.info() and stderr with logging.warning()
-
jfs about 2 years @alexis_thual stderr=STDOUT above merges stdout and stderr into a single stream, so by the time they reach log_subprocess_output they are the same. To get them separately, you have to read them separately. If the output is small, then the simple option is to read it all at once: p = subprocess.run(cmd, capture_output=True) (use the p.stdout and p.stderr strings to get the results).
-
alexis_thual about 2 years @jfs It took me some time to figure out how to distinguish between stderr and stdout and still use with to enjoy live printing. Eventually, I kept Popen, sent both stderr and stdout to PIPE, added a new kwarg err to log_subprocess_output() which enables choosing between logging.warning and logging.info, and had one with for stdout and stderr respectively, calling log_subprocess_output() with err=False and err=True. Hope that helps others!
-
jfs about 2 years @alexis_thual no, it won't work for unlimited output. Your code may eventually deadlock when the OS pipe buffer is full. To avoid it, you have to read concurrently from both stdout and stderr at the same time (this requires async I/O, e.g., threads or asyncio).
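A sketch of the concurrent-read approach jfs describes, using one reader thread per stream (the helper names run_and_log and _pump are illustrative, not from the answers):

```python
import logging
import threading
from subprocess import Popen, PIPE

logging.basicConfig(level=logging.INFO)

def _pump(pipe, level):
    """Read lines from `pipe` and log each at `level` until EOF."""
    with pipe:
        for line in iter(pipe.readline, b''):
            logging.log(level, line.decode(errors='replace').rstrip())

def run_and_log(args):
    """Run `args`, logging stdout at INFO and stderr at ERROR concurrently.

    Reading both pipes in parallel threads prevents the deadlock that
    occurs when one OS pipe buffer fills while the other is being read.
    """
    process = Popen(args, stdout=PIPE, stderr=PIPE)
    threads = [
        threading.Thread(target=_pump, args=(process.stdout, logging.INFO)),
        threading.Thread(target=_pump, args=(process.stderr, logging.ERROR)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return process.wait()
```

Unlike stderr=STDOUT, this keeps the two streams apart at their original log levels, at the cost of losing the exact interleaving order between them.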