Python Requests non-blocking?

Disclaimer: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license, link to the original question, and attribute it to the original authors (not me): StackOverFlow

Original URL: http://stackoverflow.com/questions/14245989/
Asked by Jeff
Possible Duplicate:
Asynchronous Requests with Python requests
Is the Python module Requests non-blocking? I don't see anything in the docs about blocking or non-blocking.
If it is blocking, which module would you suggest?
Accepted answer by abarnert
Like urllib2, requests is blocking.
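For instance (a minimal sketch; the URL and timeout are placeholder choices), a call like this does not return until the full response has arrived:

    import requests

    # Execution pauses on this line until the server responds (or the timeout hits).
    response = requests.get("https://httpbin.org/get", timeout=5)  # placeholder URL
    print(response.status_code)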
But I wouldn't suggest using another library, either.
The simplest answer is to run each request in a separate thread. Unless you have hundreds of them, this should be fine. (How many hundreds is too many depends on your platform. On Windows, the limit is probably how much memory you have for thread stacks; on most other platforms the cutoff comes earlier.)
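A minimal sketch of the one-thread-per-request approach (the URLs, the timeout, and the shared results dict are placeholder choices, not part of the original answer):

    import threading
    import requests

    urls = ["https://httpbin.org/get", "https://httpbin.org/ip"]  # placeholder URLs
    results = {}

    def fetch(url):
        # Each call still blocks, but only within its own thread.
        results[url] = requests.get(url, timeout=5)

    threads = [threading.Thread(target=fetch, args=(url,)) for url in urls]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    for url, response in results.items():
        print(url, response.status_code)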
If you do have hundreds, you can put them in a threadpool. The ThreadPoolExecutor example on the concurrent.futures page is almost exactly what you need; just change the urllib calls to requests calls. (If you're on 2.x, use futures, the backport of the same package on PyPI.) The downside is that you don't actually kick off all 1000 requests at once, just the first, say, 8.
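A sketch of that adaptation, with the stdlib example's urllib call swapped for requests (the URLs, timeout, and worker count of 8 are placeholder choices):

    import concurrent.futures
    import requests

    urls = ["https://httpbin.org/get", "https://httpbin.org/ip"]  # placeholder URLs

    def load_url(url):
        # Blocking call, but the pool runs at most max_workers of them at a time.
        return requests.get(url, timeout=5)

    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as executor:
        future_to_url = {executor.submit(load_url, url): url for url in urls}
        for future in concurrent.futures.as_completed(future_to_url):
            url = future_to_url[future]
            try:
                print(url, future.result().status_code)
            except Exception as exc:
                print(url, "generated an exception:", exc)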
If you have hundreds, and they all need to be in parallel, this sounds like a job for gevent. Have it monkeypatch everything, then write the exact same code you'd write with threads, but spawning greenlets instead of Threads.
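A minimal gevent sketch of that idea, assuming gevent is installed (the URLs and timeout are placeholders); note that monkey.patch_all() has to run before requests is imported so its sockets become cooperative:

    from gevent import monkey
    monkey.patch_all()  # patch sockets and friends before importing requests

    import gevent
    import requests

    urls = ["https://httpbin.org/get", "https://httpbin.org/ip"]  # placeholder URLs

    def fetch(url):
        # Reads like blocking code, but yields to other greenlets during I/O.
        return requests.get(url, timeout=5)

    jobs = [gevent.spawn(fetch, url) for url in urls]
    gevent.joinall(jobs)
    for job in jobs:
        print(job.value.status_code)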
grequests, which evolved out of the old async support directly in requests, effectively does the gevent + requests wrapping for you. And for the simplest cases, it's great. But for anything non-trivial, I find it easier to read explicit gevent code. Your mileage may vary.
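For comparison, a sketch of the same thing through grequests, assuming it is installed (URLs are placeholders; grequests.map returns None for requests that failed):

    import grequests

    urls = ["https://httpbin.org/get", "https://httpbin.org/ip"]  # placeholder URLs

    # Build the unsent requests, then send them all concurrently with map().
    unsent = (grequests.get(url) for url in urls)
    for response in grequests.map(unsent):
        print(response.status_code if response is not None else "request failed")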
Of course if you need to do something really fancy, you probably want to go to twisted, tornado, or tulip (or wait a few months for tulip to be part of the stdlib).
Answered by Adam
It is blocking, but this reminded me of a neat little wrapper a guy I know put around gevent, which fell back to eventlet, and then to threads if neither of those two was present. You can add functions to data structures that resemble either dicts or lists; as soon as the functions are added, they are executed in the background, and the values they return become available in place of the functions as soon as they finish executing. It's here.

