python: nonblocking subprocess, check stdout

Posted by Will Cavanagh on Stack Overflow, 2011-01-03.

OK, so the problem I'm trying to solve is this:

I need to run a program with some flags set, check on its progress, and report back to a server. So I need my script to avoid blocking while the program executes, but I also need to be able to read its output. Unfortunately, I don't think any of the methods available on Popen will read the output without blocking. I tried the following, which is a bit hacky (are we even allowed to read and write the same file from two different objects?):

import time
import subprocess

with open("stdout.txt", "wb") as outf:
    with open("stderr.txt", "wb") as errf:
        command = ['Path\\To\\Program.exe', 'para', 'met', 'ers']
        p = subprocess.Popen(command, stdout=outf, stderr=errf)
        isdone = False
        while not isdone:
            # Re-open the file the child is writing to and echo whatever
            # has appeared so far -- this is the part that feels wrong.
            with open("stdout.txt", "rb") as readoutf:
                for line in readoutf:
                    print(line)
            print("waiting...\r\n")
            if p.poll() is not None:
                isdone = True
            time.sleep(1)
        # stdout was redirected to the file above, so there is nothing
        # left for communicate() to capture at this point.
        output = p.communicate()[0]
        print(output)

Unfortunately, Popen doesn't seem to write to my file until after the command terminates.
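One alternative I've been considering (just a sketch, assuming Python 3; the background-reader thread is my own idea, not something from the code above) is to hand Popen pipes instead of files and drain the pipe from a thread, so the main loop can poll progress without blocking. Even then, I gather the child process may block-buffer its own stdout when it isn't attached to a terminal, so lines could still arrive in chunks:

import subprocess
import threading
import time

def drain(pipe, sink):
    # Read the child's output line by line as it becomes available.
    for line in iter(pipe.readline, b''):
        sink.append(line)
    pipe.close()

command = ['Path\\To\\Program.exe', 'para', 'met', 'ers']
p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

captured = []
reader = threading.Thread(target=drain, args=(p.stdout, captured))
reader.start()

while p.poll() is None:
    # The main loop is free to report progress (e.g. POST to the server) here.
    print("waiting... %d lines so far" % len(captured))
    time.sleep(1)

reader.join()
print(b''.join(captured).decode(errors='replace'))

Would something along those lines be the right direction, or is there a simpler way?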

Does anyone know of a way to do this? I'm not dedicated to using Python, but I do need to send POST requests to a server from the same script, so Python seemed like an easier choice than, say, shell scripting. A rough sketch of the reporting side is below.
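For the reporting side, this is roughly what I have in mind, using only the standard library (assuming Python 3; the URL, payload, and report_progress name are just placeholders I made up):

import json
import urllib.request

def report_progress(lines_seen):
    # POST a small JSON progress payload; the endpoint is only a placeholder.
    data = json.dumps({"lines": lines_seen}).encode("utf-8")
    req = urllib.request.Request(
        "http://example.com/progress",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status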

Thanks! Will
