Python: Run subprocess and print output to logging

Disclaimer: This page reproduces a popular StackOverflow question and its answers under the CC BY-SA 4.0 license. If you reuse this content, you must follow the same CC BY-SA license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/21953835/


Run subprocess and print output to logging

Tags: python, logging, subprocess

Asked by Kostya Bazhanov

I am looking for a way to call shell scripts from Python and write their stdout and stderr to a file using logging. Here is my code:


import logging
import os
import shlex
import subprocess
import sys
import tempfile

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)

    logging.info('Subprocess: \"' + command_line + '\"')

    process_succeeded = True
    try:
        process_output_filename = tempfile.mktemp(suffix = 'subprocess_tmp_file_')
        process_output = open(process_output_filename, 'w')

        command_line_process = subprocess.Popen(command_line_args,\
                                                stdout = process_output,\
                                                stderr = process_output)
        command_line_process.wait()
        process_output.close()

        process_output = open(process_output_filename, 'r')
        log_subprocess_output(process_output)
        process_output.close()

        os.remove(process_output_filename)
    except:
        exception = sys.exc_info()[1]
        logging.info('Exception occured: ' + str(exception))
        process_succeeded = False

    if process_succeeded:
        logging.info('Subprocess finished')
    else:
        logging.info('Subprocess failed')

    return process_succeeded

I am sure there is a way to do this without creating a temporary file to store the process output. Any ideas?


Accepted answer by Bakuriu

I am sure there is a way to do this without creating a temporary file to store the process output


You simply have to check the documentation of Popen, in particular about stdout and stderr:


stdin, stdout and stderr specify the executed program's standard input, standard output and standard error file handles, respectively. Valid values are PIPE, an existing file descriptor (a positive integer), an existing file object, and None. PIPE indicates that a new pipe to the child should be created. With the default settings of None, no redirection will occur; the child's file handles will be inherited from the parent. Additionally, stderr can be STDOUT, which indicates that the stderr data from the child process should be captured into the same file handle as for stdout.

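For illustration, the "existing file object" option from the quote above would let the child process write directly into an already-open log file. A minimal sketch, assuming command_line_args is the split command list and 'subprocess.log' is just a placeholder filename:

import subprocess

# Pass an open file object so the child writes straight into it,
# with stderr merged into the same handle.
with open('subprocess.log', 'w') as log_file:
    process = subprocess.Popen(command_line_args,
                               stdout=log_file,
                               stderr=subprocess.STDOUT)
    exit_code = process.wait()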

So you can see that you can either use a file object, or the PIPE value. This allows you to use the communicate() method to retrieve the output:


import subprocess
from StringIO import StringIO  # Python 2 module

# 'arguments' is the command split into a list, e.g. from shlex.split()
process = subprocess.Popen(arguments, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, error = process.communicate()  # 'error' is None here because stderr is merged into stdout
log_subprocess_output(StringIO(output))
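
On Python 3 the StringIO module no longer exists and communicate() returns bytes by default, so a roughly equivalent sketch (assuming the same arguments list and log_subprocess_output helper) would be:

import io
import subprocess

process = subprocess.Popen(arguments, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = process.communicate()          # bytes on Python 3
log_subprocess_output(io.BytesIO(output))  # wrap the bytes in a file-like object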


I'd rewrite your code as:


import shlex
import logging
import subprocess
from StringIO import StringIO

def run_shell_command(command_line):
    command_line_args = shlex.split(command_line)

    logging.info('Subprocess: "' + command_line + '"')

    try:
        command_line_process = subprocess.Popen(
            command_line_args,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
        )

        process_output, _ = command_line_process.communicate()

        # process_output is now a string, not a file,
        # you may want to do:
        # process_output = StringIO(process_output)
        log_subprocess_output(process_output)
    except (OSError, subprocess.CalledProcessError) as exception:
        logging.info('Exception occurred: ' + str(exception))
        logging.info('Subprocess failed')
        return False
    else:
        # no exception was raised
        logging.info('Subprocess finished')

    return True
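
As a usage sketch (the 'ls -la' command is only a placeholder, and a log_subprocess_output() helper is assumed to be defined), the rewritten function could be called like this:

import logging

logging.basicConfig(level=logging.INFO)

if run_shell_command('ls -la'):
    logging.info('command completed successfully')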

Answer by jfs

You could try to pass the pipe directly without buffering the whole subprocess output in memory:


from subprocess import Popen, PIPE, STDOUT

process = Popen(command_line_args, stdout=PIPE, stderr=STDOUT)
with process.stdout:
    log_subprocess_output(process.stdout)
exitcode = process.wait() # 0 means success

where log_subprocess_output() could look like:


def log_subprocess_output(pipe):
    for line in iter(pipe.readline, b''): # b'\n'-separated lines
        logging.info('got line from subprocess: %r', line)
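
Putting the two snippets together, a complete runnable sketch might look like this (['ls', '-la'] is only a placeholder command):

import logging
from subprocess import Popen, PIPE, STDOUT

logging.basicConfig(level=logging.INFO)

def log_subprocess_output(pipe):
    for line in iter(pipe.readline, b''):  # b'\n'-separated lines
        logging.info('got line from subprocess: %r', line)

process = Popen(['ls', '-la'], stdout=PIPE, stderr=STDOUT)
with process.stdout:
    log_subprocess_output(process.stdout)
exitcode = process.wait()  # 0 means success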

Answer by Anshuman Goel

I was trying to achieve the same with check_call and check_output. I found this solution to work.


import logging
import threading
import os
import subprocess

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)

class LogPipe(threading.Thread):

    def __init__(self, level):
        """Setup the object with a logger and a loglevel
        and start the thread
        """
        threading.Thread.__init__(self)
        self.daemon = False
        self.level = level
        self.fdRead, self.fdWrite = os.pipe()
        self.pipeReader = os.fdopen(self.fdRead)
        self.start()

    def fileno(self):
        """Return the write file descriptor of the pipe"""
        return self.fdWrite

    def run(self):
        """Run the thread, logging everything."""
        for line in iter(self.pipeReader.readline, ''):
            logging.log(self.level, line.strip('\n'))

        self.pipeReader.close()

    def close(self):
        """Close the write end of the pipe."""
        os.close(self.fdWrite)

    def write(self, message):
        """If your code has something like sys.stdout.write"""
        logging.log(self.level, message)

    def flush(self):
        """If your code has something like sys.stdout.flush"""
        pass

After implementing it, I performed the below steps:


try:
    # It works on multiple handlers as well
    logging.basicConfig(handlers=[logging.FileHandler(log_file), logging.StreamHandler()])
    sys.stdout = LogPipe(logging.INFO)
    sys.stderr = LogPipe(logging.ERROR)
...
    subprocess.check_call(subprocess_cmd, stdout=sys.stdout, stderr=sys.stderr)
    export_output = subprocess.check_output(subprocess_cmd, stderr=sys.stderr)
...
finally:
    sys.stdout.close()
    sys.stderr.close()
    # It is necessary to close the file handlers properly.
    sys.stdout = sys.__stdout__
    sys.stderr = sys.__stderr__
    logging.shutdown()
    os.remove(log_file)
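
Alternatively, since Popen uses the fileno() of whatever file-like object it is given for stdout and stderr, the LogPipe instances defined above can be handed straight to the subprocess. A minimal sketch along those lines (['ls', '-la'] is only a placeholder command):

import logging
import subprocess

logpipe = LogPipe(logging.INFO)
errpipe = LogPipe(logging.ERROR)

# Popen duplicates the pipes' write file descriptors into the child,
# while the LogPipe threads read the other ends and forward lines to logging.
process = subprocess.Popen(['ls', '-la'], stdout=logpipe, stderr=errpipe)
process.wait()

# Close the write ends so the reader threads see EOF and terminate.
logpipe.close()
errpipe.close()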