Getting output from a Python subprocess job into Tornado


Overview

I have searched a lot, but I have not found out how to get the output of a running Python subprocess into Tornado. What I want is something like Travis CI: on an admin page I start a job, the server receives the request and launches a subprocess. This subprocess does some data mining and feeds a string buffer with some log output. I would fetch that log with some AJAX, via setTimeout or a WebSocket, and print it into the page. Even if the user closes the page and returns to it later, the log should still be there and normally keep updating. So, really very similar to Travis.

Solution

This blog post shows one way to do it: http://stefaanlippens.net/python-asynchronous-subprocess-pipe-reading

Essentially, the post shows how to read a process's output while avoiding deadlock, by reading stdout and stderr asynchronously. You can replace the producer command in __main__ with whatever command you want to run, and replace the print statements with code that handles the output in Tornado.
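To make the idea concrete, here is a minimal stdlib-only sketch of that pattern: a background thread drains the subprocess pipe into a shared buffer, and a web handler (Tornado or otherwise) could return the unseen portion on each AJAX poll. The `JobLog` class and its methods are my own invention for illustration, not part of the quoted post or of Tornado.

```python
import subprocess
import sys
import threading

class JobLog:
    """Run a command and collect its output lines in a shared buffer.

    Hypothetical helper (not from the post): a Tornado handler could call
    tail() on each AJAX poll and send the client only the new lines.
    """

    def __init__(self, command):
        self.lines = []                      # accumulated log lines
        self._lock = threading.Lock()
        self._process = subprocess.Popen(
            command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        self._thread = threading.Thread(target=self._pump, daemon=True)
        self._thread.start()

    def _pump(self):
        # readline() blocks only this thread, never the web server.
        for raw in iter(self._process.stdout.readline, b''):
            with self._lock:
                self.lines.append(raw.decode('utf-8', 'replace'))
        self._process.stdout.close()

    def wait(self):
        self._thread.join()
        return self._process.wait()

    def tail(self, offset=0):
        # Lines the client has not seen yet, plus the new offset to resume from.
        with self._lock:
            return self.lines[offset:], len(self.lines)

job = JobLog([sys.executable, '-c', "print('step 1'); print('step 2')"])
job.wait()                                   # a real server would keep serving instead
lines, offset = job.tail(0)
print(lines)   # ['step 1\n', 'step 2\n']
```

Because `tail()` takes an offset, a client that closes the page and comes back later can resume from where it left off, which is the behavior the question asks for.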

Update: I have included the content below in case the blog post ever disappears:

…what if you want to read standard output and error line by line,
for example because you want to monitor a longer running process? On
the web you can find many solutions, with varying degrees of
complexity, abstraction and dependencies. One solution (with limited
code and no dependencies outside the standard library) is to read the
pipes in separate threads, so one pipe can't block another.

The code below shows an example implementation. The script is set up
in such a way that it is used both for the parent and the child process.

For the child process: when called with the 'produce' argument, it runs the produce() function, which just renders some lines randomly on
standard output and standard error. Between the lines there is a touch
of delay to simulate a longer running process.
The parent process (the script called without arguments), implemented in the consume() function, invokes the same script in "child mode" as a
subprocess and monitors its output line by line, without knowing in
advance which pipe each line will come from.

The AsynchronousFileReader class is for the threads that will read the
standard output and error pipes asynchronously and put each line on a
queue. The main thread can then monitor the subprocess by watching the
lines as they come in on the queues.

import sys
import subprocess
import random
import time
import threading
import queue


class AsynchronousFileReader(threading.Thread):
    '''
    Helper class to implement asynchronous reading of a file
    in a separate thread. Pushes read lines on a queue to
    be consumed in another thread.
    '''

    def __init__(self, fd, queue):
        assert callable(fd.readline)
        threading.Thread.__init__(self)
        self._fd = fd
        self._queue = queue

    def run(self):
        '''The body of the thread: read lines and put them on the queue.'''
        for line in iter(self._fd.readline, b''):
            self._queue.put(line)

    def eof(self):
        '''Check whether there is no more content to expect.'''
        return not self.is_alive() and self._queue.empty()


def consume(command):
    '''
    Example of how to consume standard output and standard error of
    a subprocess asynchronously without risk of deadlocking.
    '''
    # Launch the command as a subprocess.
    process = subprocess.Popen(command, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)

    # Launch the asynchronous readers of the process' stdout and stderr.
    stdout_queue = queue.Queue()
    stdout_reader = AsynchronousFileReader(process.stdout, stdout_queue)
    stdout_reader.start()
    stderr_queue = queue.Queue()
    stderr_reader = AsynchronousFileReader(process.stderr, stderr_queue)
    stderr_reader.start()

    # Check the queues for output (until there is nothing more to get).
    while not stdout_reader.eof() or not stderr_reader.eof():
        # Show what we received from standard output.
        while not stdout_queue.empty():
            line = stdout_queue.get()
            print('Received line on standard output: ' + repr(line))

        # Show what we received from standard error.
        while not stderr_queue.empty():
            line = stderr_queue.get()
            print('Received line on standard error: ' + repr(line))

        # Sleep a bit before asking the readers again.
        time.sleep(.1)

    # Let's be tidy and join the threads we've started.
    stdout_reader.join()
    stderr_reader.join()

    # Close the subprocess' file descriptors.
    process.stdout.close()
    process.stderr.close()


def produce(items=10):
    '''
    Dummy function to randomly render a couple of lines
    on standard output and standard error.
    '''
    for i in range(items):
        output = random.choice([sys.stdout, sys.stderr])
        output.write('line %d on %s\n' % (i, output))
        output.flush()
        time.sleep(random.uniform(.1, 1))


if __name__ == '__main__':
    # The main flow:
    # with the command line argument 'produce', act as a producer;
    # otherwise be a consumer (which launches a producer as a subprocess).
    if len(sys.argv) == 2 and sys.argv[1] == 'produce':
        produce(10)
    else:
        consume([sys.executable, sys.argv[0], 'produce'])
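To adapt this to the web-log use case from the question, the printing in consume() has to become buffering: lines go into a structure the server can return to clients instead of to the terminal. Here is a short sketch of such a variant; `consume_to_buffer` is a hypothetical adaptation of mine, not part of the quoted post.

```python
import queue
import subprocess
import sys
import threading

def consume_to_buffer(command):
    """Run a command and return its tagged output lines as a list.

    Hypothetical variant of consume(): both pipes are pumped into one
    queue by reader threads, then drained into a buffer a web handler
    could serve.
    """
    process = subprocess.Popen(command, stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    q = queue.Queue()

    def pump(fd, tag):
        # Read one pipe to EOF, tagging each decoded line with its origin.
        for line in iter(fd.readline, b''):
            q.put((tag, line.decode()))
        fd.close()

    threads = [threading.Thread(target=pump, args=(process.stdout, 'out')),
               threading.Thread(target=pump, args=(process.stderr, 'err'))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    process.wait()

    buffer = []
    while not q.empty():
        buffer.append(q.get())
    return buffer

log = consume_to_buffer([sys.executable, '-c',
                         "import sys; print('ok'); sys.stderr.write('warn\\n')"])
print(log)
```

Since stdout and stderr are read by independent threads, the relative order of lines from the two pipes is not guaranteed; the tag tells you which stream each line came from.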

Source: http://outofmemory.cn/langs/1205126.html
