Disable output buffering


Solution 1

From Magnus Lycka answer on a mailing list:

You can skip buffering for a whole Python process using "python -u" (or "#!/usr/bin/env python -u", etc.) or by setting the PYTHONUNBUFFERED environment variable.
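As a minimal sketch of both switches (the child.py filename here is only a placeholder), you can also apply them when launching a script from Python itself:

import os
import subprocess
import sys

# Hypothetical child script; substitute your own. Either the -u flag or
# PYTHONUNBUFFERED=1 in the environment disables buffering for the child.
env = dict(os.environ, PYTHONUNBUFFERED="1")
subprocess.run([sys.executable, "-u", "child.py"], env=env)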

You could also replace sys.stdout with some other stream-like wrapper which flushes after every call.

class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        # Write, then flush immediately so nothing sits in the buffer.
        self.stream.write(data)
        self.stream.flush()
    def writelines(self, datas):
        self.stream.writelines(datas)
        self.stream.flush()
    def __getattr__(self, attr):
        # Delegate everything else (fileno, encoding, ...) to the wrapped stream.
        return getattr(self.stream, attr)

import sys
sys.stdout = Unbuffered(sys.stdout)
print('Hello')

Solution 2

I would rather have put my answer in How to flush output of print function? or in Python's print function that flushes the buffer when it's called?, but since they were marked as duplicates of this one (which I do not agree with), I'll answer it here.

Since Python 3.3, print() supports the keyword argument "flush" (see documentation):

print('Hello World!', flush=True)
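As a small usage sketch: with flush=True each character appears immediately, even when stdout goes to a pipe or file where it would otherwise sit in the buffer until the loop finishes.

import time

for _ in range(5):
    print('.', end='', flush=True)   # appears right away, not when the loop ends
    time.sleep(1)
print(flush=True)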

Solution 3

# reopen stdout file descriptor with write mode
# and 0 as the buffer size (unbuffered)
import io, os, sys
try:
    # Python 3, open as binary, then wrap in a TextIOWrapper with write-through.
    sys.stdout = io.TextIOWrapper(open(sys.stdout.fileno(), 'wb', 0), write_through=True)
    # If flushing on newlines is sufficient, as of 3.7 you can instead just call:
    # sys.stdout.reconfigure(line_buffering=True)
except TypeError:
    # Python 2
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

Credits: "Sebastian", somewhere on the Python mailing list.
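The reconfigure() call mentioned in the code comment above can also be used on its own on Python 3.7+; here is a minimal sketch (line_buffering flushes on each newline, write_through pushes every write straight to the underlying buffer):

import sys

# Python 3.7+: adjust the existing text stream in place instead of rewrapping it.
sys.stdout.reconfigure(line_buffering=True, write_through=True)
print('this line is flushed as soon as it is printed')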

Solution 4

Yes, it is.

You can disable it on the commandline with the "-u" switch.

Alternatively, you could call sys.stdout.flush() after every write (or wrap it in an object that does this automatically).
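A minimal sketch of the manual variant:

import sys

sys.stdout.write('working...\n')
sys.stdout.flush()  # push any buffered output out immediately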

Solution 5

This relates to Cristóvão D. Sousa's answer, but I couldn't comment yet.

A straightforward way of using the flush keyword argument of Python 3 in order to always have unbuffered output is:

import functools
print = functools.partial(print, flush=True)

Afterwards, print will always flush the output directly (unless flush=False is given).

Note (a) that this answers the question only partially, as it doesn't redirect all of the output. But I guess print is the most common way of producing output to stdout/stderr in Python, so these two lines probably cover most use cases.

Note (b) that it only works in the module/script where you defined it. This can be good when writing a module, as it doesn't mess with sys.stdout.

Python 2 doesn't provide the flush argument, but you can emulate a Python 3-style print function as described in https://stackoverflow.com/a/27991478/3734258 .
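If you need the flushing print in every module of the process rather than just the one where you rebind the name (one of the comments below suggests the same), here is a sketch that patches the builtin instead:

import builtins
import functools

# Rebinding the builtin affects print() calls in all modules of the process.
builtins.print = functools.partial(print, flush=True)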


Comments

  • Eli Bendersky
    Eli Bendersky almost 2 years

    Is output buffering enabled by default in Python's interpreter for sys.stdout?

    If the answer is positive, what are all the ways to disable it?

    Suggestions so far:

    1. Use the -u command line switch
    2. Wrap sys.stdout in an object that flushes after every write
    3. Set PYTHONUNBUFFERED env var
    4. sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

    Is there any other way to set some global flag in sys/sys.stdout programmatically during execution?

    • Antti Haapala -- Слава Україні
      Antti Haapala -- Слава Україні over 7 years
For `print` in Python 3, see this answer.
    • akhan
      akhan over 7 years
      I think a drawback of -u is that it won't work for compiled bytecode or for apps with a __main__.py file as entry point.
    • Beni Cherniavsky-Paskin
      Beni Cherniavsky-Paskin almost 4 years
      The full CPython initialization logic is here: github.com/python/cpython/blob/v3.8.2/Python/…
  • Antti Rasinen
    Antti Rasinen over 15 years
    Original sys.stdout is still available as sys.__stdout__. Just in case you need it =)
  • freespace
    freespace over 15 years
This might get very confusing when you later try to capture the output using standard redirection and find you are capturing nothing! P.S. your stdout is being bolded and stuff.
  • Ryan
    Ryan over 15 years
    This the solution that I used when I ran into problems with print statements being buffered. Worked like a charm.
  • Tobu
    Tobu over 13 years
    There's a windows equivalent: stackoverflow.com/questions/881696/…
  • haridsv
    haridsv over 12 years
One big caution about selectively printing to stderr is that this causes the lines to appear out of place, so unless you also have timestamps this could get very confusing.
  • apenwarr
    apenwarr about 12 years
    O_SYNC has nothing at all to do with userspace-level buffering that this question is asking about.
  • quantum
    quantum over 11 years
    Are you sure this is not buffered?
  • wim
    wim over 11 years
    #!/usr/bin/env python -u doesn't work!! see here
  • Vladimir Keleshev
    Vladimir Keleshev about 11 years
    __getattr__ just to avoid inheritance?!
  • tzp
    tzp almost 11 years
Some notes to save some headaches: As I noticed, output buffering works differently depending on whether the output goes to a tty or to another process/pipe. If it goes to a tty, then it is flushed after each \n, but in a pipe it is buffered. In the latter case you can make use of these flushing solutions. In CPython (not in PyPy!): if you iterate over the input with for line in sys.stdin: ... then the for loop will collect a number of lines before the body of the loop is run. This behaves like buffering, though it's rather batching. Instead, do while True: line = sys.stdin.readline()
  • jfs
    jfs almost 11 years
    here's your comment. It might be a bug on older Python versions. Could you provide example code? Something like for line in sys.stdin vs. for line in iter(sys.stdin.readline, "")
  • tzp
    tzp almost 11 years
    for line in sys.stdin: print("Line: " +line); sys.stdout.flush()
  • Will
    Will over 10 years
    So, guys, what are the consequences of disabling output buffering? When would you not want to?
  • Basic
    Basic over 10 years
    @Will That's a whole other question but one major benefit to buffering is performance - writing to a console is not particularly fast, so batching the writes reduces overhead.
  • jfs
    jfs over 10 years
    @tzp: you could use iter() instead of the while loop: for line in iter(pipe.readline, ''):. You don't need it on Python 3 where for line in pipe: yields as soon as possible.
  • meawoppl
    meawoppl over 10 years
In Python 3 you can just override the name of the print function with a flushing one. It's a dirty trick though!
  • leewz
    leewz over 10 years
    Should you check for sys.stdout is sys.__stdout__ instead of relying on the replacement object having a name attribute?
  • Thomas Ahle
    Thomas Ahle about 10 years
    If the old stdout still lives on sys.__stdout__ as some have suggested, the garbage thing won't be necessary, right? It's a cool trick though.
  • Admin
    Admin about 10 years
Theoretically, if a module is loaded within the main script after stderr has been replaced, and this module imports stderr, should it be redirected to the newly set stderr from the original script?
  • tehwalrus
    tehwalrus over 9 years
    @tzp : The differing behaviour is particularly infuriating when you are using python myscript.py | tee logfile.txt - the purpose being to see what you're doing while also logging it!
  • Michael Clerx
    Michael Clerx about 9 years
    Run that twice and it crashes on windows :-)
  • jfs
    jfs over 8 years
    it looks like the read-ahead bug. It should only happen on Python 2 and if stdin is a pipe. The code in my previous comment demonstrates the issue (for line in sys.stdin provides a delayed response)
  • jfs
    jfs over 8 years
@meawoppl: you could pass the flush=True parameter to the print() function since Python 3.3.
  • Admin
    Admin over 8 years
    @MichaelClerx Mmm hmm, always remember to close your files xD.
  • Brian Arsuaga
    Brian Arsuaga over 8 years
    this works great if gunicorn isn't respecting PYTHONUNBUFFERED for some reason.
  • Zitrax
    Zitrax over 7 years
Passing -u on the command line does not help for me; adding flush=True to the print calls works, though (Python 3.5, Windows).
  • akhan
    akhan over 7 years
    The following was omitted in the copy/paste from the original post: "I don't think it will work in IDLE, since sys.stdout is already replaced with some funny object there which doesn't like to be flushed. (This could be considered a bug in IDLE though.)"
  • jpmc26
    jpmc26 over 7 years
@Halst No. Using __getattr__ allows it to work with any stream, not just whatever particular class you intend to use.
  • o11c
    o11c about 7 years
    Except that there is no flush kwarg in python2.
  • Tim
    Tim almost 7 years
@o11c, yes, you're right. I was sure I tested it, but somehow I was seemingly confused (: I modified my answer, hope it's fine now. Thanks!
  • gbmhunter
    gbmhunter almost 6 years
    As with @Federico's answer, this will not work with Python 3, as it will throw the exception ValueError: can't have unbuffered text I/O when calling print().
  • Don Hatch
    Don Hatch over 5 years
    Your "another possibility" seems at first like the most robust solution, but unfortunately it suffers a race condition in the case that another thread calls open() after your sys.stdout.close() and before your os.dup2(temp_fd, fileno). I found this out when I tried using your technique under ThreadSanitizer, which does exactly that. The failure is made louder by the fact that dup2() fails with EBUSY when it races with open() like that; see stackoverflow.com/questions/23440216/…
  • Mike
    Mike over 5 years
Editing the response to show that it is not valid in recent versions of Python.
  • not2qubit
    not2qubit over 5 years
Both os.fdopen(sys.stdout.fileno(), 'wb', 0) (note the b for binary) and flush=True work for me in 3.6.4. However, if you're using subprocess to start another script, make sure you've specified python3 if you have multiple instances of Python installed.
  • 0xC0000022L
    0xC0000022L almost 5 years
    Just wondering, but wouldn't that be a perfect use case for functools.partial?
  • MarSoft
    MarSoft over 4 years
    Thanks @0xC0000022L, this makes it look better! print = functools.partial(print, flush=True) works fine for me.
  • Oliver
    Oliver over 4 years
    @0xC0000022L indeed, I have updated the post to show that option, thanks for pointing that out
  • sdbbs
    sdbbs over 4 years
    Python 3.5 on Raspbian 9 gives me OSError: [Errno 29] Illegal seek for the line sys.stdout = os.fdopen(sys.stdout.fileno(), 'a+', buf_arg)
  • Charles Duffy
    Charles Duffy over 4 years
    Line buffering (as -oL enables) is still buffering -- see f/e stackoverflow.com/questions/58416853/…, asking why end='' makes output no longer be immediately displayed.
  • Perkins
    Perkins over 4 years
    If you want that to apply everywhere, import builtins; builtins.print = partial(print, flush=True)
  • Martijn Pieters
    Martijn Pieters over 4 years
    @not2qubit: if you use os.fdopen(sys.stdout.fileno(), 'wb', 0) you end up with a binary file object, not a TextIO stream. You'd have to add a TextIOWrapper to the mix (making sure to enable write_through to eliminate all buffers, or use line_buffering=True to only flush on newlines).
  • Russell Davis
    Russell Davis about 4 years
    If flushing on newlines is sufficient, as of Python 3.7 you can simply call sys.stdout.reconfigure(line_buffering=True)
  • Beni Cherniavsky-Paskin
    Beni Cherniavsky-Paskin almost 4 years
True, but line buffering is the default (with a tty), so does it make sense to write code assuming output is totally unbuffered? Maybe it's better to explicitly print(..., end='', flush=True) where that's important. OTOH, when several programs write to the same output concurrently, the trade-off tends to shift from seeing immediate progress to reducing output mixups, and line buffering becomes attractive. So maybe it is better not to write explicit flushes and to control buffering externally?
  • dyomas
    dyomas almost 4 years
I think not. The process itself should decide when and why it calls flush. External buffering control is a forced workaround here.
  • Nico Villanueva
    Nico Villanueva over 3 years
    For more info on streams buffering, give this a read: eklitzke.org/stdout-buffering It mentions the behavior that tzp explains above.
  • not2qubit
    not2qubit over 3 years
@RussellDavis I did this, but it seems like it doesn't stick for subsequent print()s. I still have to use print(..., flush=True) on Py3.8. Any ideas?
  • Sergey Nudnov
    Sergey Nudnov about 3 years
This is an extremely useful solution if one hosts a CGI Python script on IIS! Thanks! Along with responseBufferLimit="0" in web.config, this piece of code removes all other buffering artifacts from the script's output.
  • truedat101
    truedat101 almost 3 years
    Oddly, this approach worked when nothing else did for Python 3.x, and I am wondering why the other documented approaches (use -u flag) do not work.
  • Princy
    Princy about 2 years
    You can also set buffering=1 instead of 0 for line-buffering.
  • Akaisteph7
    Akaisteph7 almost 2 years
    This is preferable to me. Buffering happens for a reason. Entirely disabling it comes at a cost.