Python - Multiprocessing Daemon
Disclaimer: this page is an English/Chinese translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/27494725/
Asked by PythonEnthusiast
I'm creating a multiprocessing program that writes a csv file. When I run the code with d.daemon = False it works fine, i.e. it creates a file in the same folder. But when run with d.daemon = True, it does not create the file. Why is that?
My Code
I have a seed list of URLs from which I need to scrape the data.
for url in config.SEED_LIST:
    # Start a new process for each category.
    d = multiprocessing.Process(target=workers.scrap, args=())
    d.daemon = True
    d.start()
def scrap():
    import csv
    import time

    # Scraping a webpage and applying some logic takes a while, so a
    # 5-second sleep stands in for that work here. When run with
    # daemon = True the file is not created; otherwise it works fine.
    time.sleep(5)
    data = [[1, 2, 3, 4], [2224, 34, 34, 34, 34]]
    with open('1.csv', "wb") as f:
        writer = csv.writer(f)
        writer.writerows(data)
Accepted answer by Michele d'Amico
According to the multiprocessing daemon documentation, setting d.daemon = True means that when your script finishes, all daemonic subprocesses are killed. That happens before they get a chance to write anything, so no output is produced.