Check on the stdout of a running subprocess in python

Your second attempt is 90% correct. The only issue is that you are attempting to read all of tail's stdout at once after it has finished. However, tail is meant to keep running (indefinitely?) in the background, so you really want to read its stdout line by line:

from subprocess import Popen, PIPE, STDOUT

p = Popen(["tail", "-f", "/tmp/file"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
for line in p.stdout:
    print(line)

I have removed the shell=True and close_fds=True arguments. The first is unnecessary and potentially dangerous, while the second is just the default.

Remember that file objects are iterable over their lines in Python. The for loop will run until tail dies, but it will process each line as it appears, as opposed to read, which will block until tail dies.

If I create an empty file at /tmp/file, start this program and begin echoing lines into the file from another shell, the program will echo those lines. You should probably replace print with something a bit more useful.

Here is an example of commands I typed after starting the code above:

Command line

$ echo a > /tmp/file
$ echo b > /tmp/file
$ echo c >> /tmp/file

Program Output (From Python in a different shell)

b'a\n'
b'tail: /tmp/file: file truncated\n'
b'b\n'
b'c\n'
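Incidentally, the b'...' prefixes show that the lines arrive as bytes. If you would rather work with str, pass text=True (called universal_newlines before Python 3.7). A small sketch; sys.executable stands in for tail here so the example terminates on its own:

```python
from subprocess import Popen, PIPE, STDOUT
import sys

# text=True decodes the stream, so lines arrive as str instead of bytes.
p = Popen([sys.executable, "-c", "print('a')"],
          stdout=PIPE, stderr=STDOUT, text=True)
lines = list(p.stdout)
print(lines)  # ['a\n'] rather than [b'a\n']
```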

If you want your main program to remain responsive while you respond to the output of tail, start the loop in a separate thread. Make this thread a daemon so that it does not prevent your program from exiting even if tail never finishes. You can have the thread open the subprocess, or you can just pass the standard output in to it. I prefer the latter approach since it gives you more control in the main thread:

from subprocess import Popen, PIPE, STDOUT
from threading import Thread

def deal_with_stdout():
    for line in p.stdout:
        print(line)

p = Popen(["tail", "-f", "/tmp/file"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
t = Thread(target=deal_with_stdout, daemon=True)
t.start()
t.join()

The code here is nearly identical, with the addition of a new thread. I added a join() at the end so the program would behave well as an example (join waits for the thread to die before returning). You probably want to replace that with whatever processing code you would normally be running.
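One common replacement for print is to hand each line to the main thread over a queue.Queue, so the reader thread stays simple and the main thread decides what to do with the output. A sketch of that pattern, with a short-lived sys.executable command standing in for tail -f so the example terminates:

```python
from queue import Queue, Empty
from subprocess import Popen, PIPE, STDOUT
from threading import Thread
import sys

def enqueue_stdout(stream, q):
    # Forward each line from the subprocess to the queue.
    for line in stream:
        q.put(line)
    q.put(None)  # sentinel: the subprocess closed its stdout

# A short-lived command stands in for "tail -f" so this example terminates.
p = Popen([sys.executable, "-c", "print('a'); print('b'); print('c')"],
          stdout=PIPE, stderr=STDOUT)
q = Queue()
Thread(target=enqueue_stdout, args=(p.stdout, q), daemon=True).start()

lines = []
while True:
    try:
        item = q.get(timeout=1.0)  # the main thread is free between gets
    except Empty:
        continue  # no output yet; do other main-thread work here
    if item is None:
        break  # the reader thread saw EOF
    lines.append(item)

print(lines)  # [b'a\n', b'b\n', b'c\n']
```

The timeout on q.get is what keeps the main thread responsive: instead of blocking forever on tail's output, it wakes up periodically and can do other work.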

If your thread is complex enough, you may also want to inherit from Thread and override the run method instead of passing in a simple target.
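A minimal sketch of that subclassing approach; StdoutReader is a hypothetical name, and again a short-lived command stands in for tail so the example terminates:

```python
from subprocess import Popen, PIPE, STDOUT
from threading import Thread
import sys

class StdoutReader(Thread):
    # Hypothetical helper: collects a subprocess's output lines in the background.
    def __init__(self, proc):
        super().__init__(daemon=True)
        self.proc = proc
        self.lines = []

    def run(self):
        # This override replaces passing target= to Thread.
        for line in self.proc.stdout:
            self.lines.append(line)

# A short-lived command stands in for "tail" so the example terminates.
p = Popen([sys.executable, "-c", "print('hello')"], stdout=PIPE, stderr=STDOUT)
reader = StdoutReader(p)
reader.start()
reader.join()  # in real use, keep working in the main thread instead
print(reader.lines)  # [b'hello\n']
```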