Python 中的异步后台进程?
声明:本页面是 Stack Overflow 热门问题的中英对照翻译,遵循 CC BY-SA 4.0 协议。如果您需要使用它,必须同样遵循 CC BY-SA 许可,注明原文地址和作者信息,并将其归于原作者(而不是我):Stack Overflow
原文地址: http://stackoverflow.com/questions/2496772/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): Stack Overflow
Asynchronous background processes in Python?
提问 by Geuis
I have been using this as a reference, but not able to accomplish exactly what I need: Calling an external command in Python
我一直在使用它作为参考,但无法完全完成我所需要的:Calling an external command in Python
I also was reading this: http://www.python.org/dev/peps/pep-3145/
我也在读这个:http://www.python.org/dev/peps/pep-3145/
For our project, we have 5 svn checkouts that need to update before we can deploy our application. In my dev environment, where speedy deployments are a bit more important for productivity than a production deployment, I have been working on speeding up the process.
对于我们的项目,我们有 5 个 svn checkout 需要更新,然后才能部署我们的应用程序。在我的开发环境中,快速部署比生产部署对生产力更重要,我一直在努力加快流程。
I have a bash script that has been working decently but has some limitations. I fire up multiple 'svn updates' with the following bash command:
我有一个运行良好但有一些限制的 bash 脚本。我用下面的 bash 命令同时启动多个 svn update:
(svn update /repo1) & (svn update /repo2) & (svn update /repo3) &
These all run in parallel and it works pretty well. I also use this pattern in the rest of the build script for firing off each ant build, then moving the wars to Tomcat.
这些都是并行运行的,而且效果很好。我还在构建脚本的其余部分使用同样的模式来启动各个 ant 构建,然后将 war 包移至 Tomcat。
However, I have no control over stopping deployment if one of the updates or a build fails.
但是,如果其中一个更新或构建失败,我无法控制停止部署。
I'm re-writing my bash script with Python so I have more control over branches and the deployment process.
我正在用 Python 重写我的 bash 脚本,因此我可以更好地控制分支和部署过程。
I am using subprocess.call() to fire off the 'svn update /repo' commands, but each one is acting sequentially. I try '(svn update /repo) &' and those all fire off, but the result code returns immediately. So I have no way to determine if a particular command fails or not in the asynchronous mode.
我正在使用 subprocess.call() 来触发 'svn update /repo' 命令,但每个命令都是按顺序执行的。我尝试 '(svn update /repo) &' 并且所有这些都被触发,但结果代码立即返回。所以我无法确定特定命令在异步模式下是否失败。
import subprocess
subprocess.call( 'svn update /repo1', shell=True )
subprocess.call( 'svn update /repo2', shell=True )
subprocess.call( 'svn update /repo3', shell=True )
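As a side note, subprocess.call() already returns the command's exit status, so the sequential version can at least stop at the first failure; what it cannot do is run the updates concurrently. A minimal sketch of that check (the update_all helper and its svn_cmd parameter are illustrative, not from the question):

```python
import subprocess

def update_all(repos, svn_cmd='svn'):
    """Run `svn update` on each repo sequentially, stopping at the first
    failure. Returns the failing repo path, or None if all succeeded.
    (svn_cmd is parameterised here only so the sketch is testable.)"""
    for repo in repos:
        # call() blocks until the command finishes and returns its exit status
        if subprocess.call([svn_cmd, 'update', repo]) != 0:
            return repo
    return None
```

Running the updates in parallel while still catching failures is what the answer below addresses.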
I'd love to find a way to have Python fire off each Unix command, and if any of the calls fails at any time the entire script stops.
我很想找到一种方法让 Python 触发每个 Unix 命令,如果任何调用在任何时候失败,整个脚本都会停止。
回答 by nosklo
Don't use shell=True. It will needlessly invoke the shell to call your svn program, and that will give you the shell's return code instead of svn's.
不要使用 shell=True。它会不必要地通过 shell 来调用你的 svn 程序,这样你得到的将是 shell 的返回码,而不是 svn 的返回码。
import subprocess

repos = ['/repo1', '/repo2', '/repo3']
# launch 3 async calls:
procs = [subprocess.Popen(['svn', 'update', repo]) for repo in repos]
# wait.
for proc in procs:
    proc.wait()
# check for results:
if any(proc.returncode != 0 for proc in procs):
    print('Something failed')
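If the goal from the question — stopping the entire script as soon as any update fails — matters more than letting the slower updates finish, the wait loop can be replaced by a polling loop that terminates the surviving processes on the first failure. A minimal sketch (the run_parallel helper is illustrative, not part of the original answer):

```python
import subprocess
import time

def run_parallel(cmds):
    """Launch all commands at once; as soon as any exits non-zero,
    terminate the rest. Returns True if every command succeeded."""
    procs = [subprocess.Popen(cmd) for cmd in cmds]
    running = list(procs)
    failed = False
    while running and not failed:
        for proc in list(running):
            code = proc.poll()      # None while still running
            if code is None:
                continue
            running.remove(proc)
            if code != 0:
                failed = True       # first failure: bail out
        time.sleep(0.05)
    for proc in running:            # only non-empty after a failure
        proc.terminate()
        proc.wait()
    return not failed

# e.g. abort the whole deployment if any checkout fails:
# if not run_parallel([['svn', 'update', r] for r in repos]):
#     sys.exit('svn update failed')
```

The 0.05-second poll interval is arbitrary; for long-running svn updates, anything well under a second is effectively immediate.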