
Python auto-flush stdout and stderr to local text file at runtime


I have a Python script that continuously monitors various sensors. Periodically, the script reports status via print() statements.

My goal is to have my Python script's stdout and stderr written to files in real time.

My current solution is based on the following two answers:

Here is my current code:

from contextlib import redirect_stdout, redirect_stderr
from os.path import join
from some_other_package import get_base_logging_path  # returns filepath (str)

def main():
    stdout_log_path = join(get_base_logging_path(), "stdout.txt")
    stderr_log_path = join(get_base_logging_path(), "stderr.txt")
    with open(stdout_log_path, 'w') as f1, redirect_stdout(f1), \
         open(stderr_log_path, 'w') as f2, redirect_stderr(f2):
        do_stuff()

Here is where I come to Stack Overflow: the files stdout.txt and stderr.txt are only updated when the script either errors out or I safely stop it with Ctrl-C.

I would like (particularly stdout.txt) to be updated every time a new print() statement is executed in my Python code. How can I have my program auto-flush stdout and stderr to the text file?

Note: I tried python -u and it doesn't seem to work.


Solution

  • Setting the file's buffering to 1 (line buffering in text mode) worked for me:

    from contextlib import redirect_stdout
    import time

    with open("foo.txt", "w", buffering=1) as handle:  # buffering=1: line buffered
        with redirect_stdout(handle):
            print("foo")    # flushed to foo.txt as soon as the line is written
            time.sleep(30)  # foo.txt already contains "foo" during this sleep
            print("bar")
    

    Using print() with the flush keyword argument also works:

    print("foobar", flush=True)
    

    but the latter is not an option if you cannot modify the print statements.
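Applied to the original two-file setup, the line-buffering fix just means passing buffering=1 to both open() calls. A minimal self-contained sketch, using a temp directory in place of the question's get_base_logging_path():

```python
import sys
import tempfile
from contextlib import redirect_stdout, redirect_stderr
from os.path import join

log_dir = tempfile.mkdtemp()  # stand-in for get_base_logging_path()
stdout_log_path = join(log_dir, "stdout.txt")
stderr_log_path = join(log_dir, "stderr.txt")

# buffering=1 requests line buffering in text mode: each completed
# line is flushed to the file as soon as its newline is written.
with open(stdout_log_path, "w", buffering=1) as f1, redirect_stdout(f1), \
     open(stderr_log_path, "w", buffering=1) as f2, redirect_stderr(f2):
    print("sensor reading: 42")               # lands in stdout.txt immediately
    print("sensor warning", file=sys.stderr)  # lands in stderr.txt immediately
    # Still inside the with-block, yet the line is already on disk:
    with open(stdout_log_path) as check:
        assert check.read() == "sensor reading: 42\n"
```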
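If the file handle is opened elsewhere and changing the open() call is also not possible, TextIOWrapper.reconfigure() (Python 3.7+) can switch an already-open text stream to line buffering after the fact. A sketch:

```python
import tempfile
from contextlib import redirect_stdout
from os.path import join

path = join(tempfile.mkdtemp(), "out.txt")
handle = open(path, "w")  # fully buffered by default for regular files

# TextIOWrapper.reconfigure() (Python 3.7+) switches the already-open
# stream to line buffering without reopening the file.
handle.reconfigure(line_buffering=True)

with redirect_stdout(handle):
    print("flushed per line")
    # The line is on disk before the redirect even ends:
    with open(path) as check:
        assert check.read() == "flushed per line\n"
handle.close()
```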