Original: http://stackoverflow.com/questions/31623194/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share them, but you must attribute them to the original authors (not me): StackOverflow
Asyncio two loops for different I/O tasks?
Asked by brunoop
I am using Python3 Asyncio module to create a load balancing application. I have two heavy IO tasks:
- A SNMP polling module, which determines the best possible server
- A "proxy-like" module, which balances the petitions to the selected server.
Both processes are going to run forever, are independent from each other, and should not be blocked by the other one.
I can't use one event loop because they would block each other. Is there any way to have two event loops, or do I have to use multithreading/multiprocessing?
I tried using asyncio.new_event_loop() but haven't managed to make it work.
Answered by Nihal Sharma
The asyncio event loop runs in a single thread and will not run anything in parallel; that is how it is designed. The closest thing I can think of is using asyncio.wait.
from asyncio import coroutine
import asyncio

@coroutine
def some_work(x, y):
    print("Going to do some heavy work")
    yield from asyncio.sleep(1.0)
    print(x + y)

@coroutine
def some_other_work(x, y):
    print("Going to do some other heavy work")
    yield from asyncio.sleep(3.0)
    print(x * y)

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    # asyncio.async() was renamed to asyncio.ensure_future()
    # (async is a reserved keyword since Python 3.7)
    loop.run_until_complete(asyncio.wait([asyncio.ensure_future(some_work(3, 4)),
                                          asyncio.ensure_future(some_other_work(3, 4))]))
    loop.close()
An alternate way is to use asyncio.gather() - it returns the results of the given list of futures.
tasks = [asyncio.Task(some_work(3, 4)), asyncio.Task(some_other_work(3, 4))]
loop.run_until_complete(asyncio.gather(*tasks))
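On current Python (3.7+) the same idea can be written with async def and asyncio.run(), without touching the loop object directly. A minimal sketch, with illustrative coroutine names:

```python
import asyncio

async def some_work(x, y):
    # simulate an I/O-bound task
    await asyncio.sleep(0.1)
    return x + y

async def some_other_work(x, y):
    await asyncio.sleep(0.2)
    return x * y

async def main():
    # gather runs both coroutines concurrently and
    # returns their results in submission order
    return await asyncio.gather(some_work(3, 4), some_other_work(3, 4))

results = asyncio.run(main())
print(results)  # [7, 12]
```

Because both coroutines sleep concurrently, the whole run takes about 0.2 s rather than 0.3 s.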
Answered by brunoop
Answering my own question to post my solution:
What I ended up doing was creating a thread for the polling module, with a new event loop inside that thread, so now every module runs in a different loop. It is not a perfect solution, but it is the only one that made sense to me (I wanted to avoid threads, but since it is only one...). Example:
import asyncio
import threading

def worker():
    # the polling module gets its own loop, private to this thread
    second_loop = asyncio.new_event_loop()
    execute_polling_coroutines_forever(second_loop)

threads = []
t = threading.Thread(target=worker)
threads.append(t)
t.start()

loop = asyncio.get_event_loop()
execute_proxy_coroutines_forever(loop)
Asyncio requires that every loop runs its coroutines in the same thread. Using this method you have one event loop for each thread, and they are totally independent: every loop will execute its coroutines on its own thread, so that is not a problem. As I said, it's probably not the best solution, but it worked for me.
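A runnable sketch of the same pattern, with stand-in coroutines instead of the real polling/proxy modules (all names here are illustrative):

```python
import asyncio
import threading

async def polling():
    await asyncio.sleep(0.1)  # stand-in for the SNMP polling work
    return "polled"

async def proxy():
    await asyncio.sleep(0.1)  # stand-in for the proxy work
    return "proxied"

results = {}

def worker():
    # a fresh event loop, private to this thread
    second_loop = asyncio.new_event_loop()
    asyncio.set_event_loop(second_loop)
    results["polling"] = second_loop.run_until_complete(polling())
    second_loop.close()

t = threading.Thread(target=worker)
t.start()

# the main thread keeps its own loop for the proxy module
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)
results["proxy"] = loop.run_until_complete(proxy())
loop.close()
t.join()
print(results)
```

Note the asyncio.set_event_loop() call in the worker: a loop created with new_event_loop() is not bound to the thread automatically.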
Answered by Markus Bergkvist
If the proxy server is running all the time it cannot switch back and forth. The proxy listens for client requests and makes them asynchronous, but the other task cannot execute, because this one is serving forever.
If the proxy is a coroutine and is starving the SNMP poller (never awaits), aren't the client requests being starved as well?
every coroutine will run forever, they will not end
This should be fine, as long as they do await/yield from. The echo server will also run forever; it doesn't mean you can't run several servers (on different ports though) in the same loop.
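To illustrate that point, a minimal sketch running two echo servers on different ports in one loop and talking to both concurrently (port 0 asks the OS for free ports):

```python
import asyncio

async def handle_echo(reader, writer):
    data = await reader.read(100)
    writer.write(data)       # echo the bytes back
    await writer.drain()
    writer.close()

async def main():
    # two independent servers, one event loop
    srv_a = await asyncio.start_server(handle_echo, '127.0.0.1', 0)
    srv_b = await asyncio.start_server(handle_echo, '127.0.0.1', 0)
    port_a = srv_a.sockets[0].getsockname()[1]
    port_b = srv_b.sockets[0].getsockname()[1]

    async def ping(port, payload):
        reader, writer = await asyncio.open_connection('127.0.0.1', port)
        writer.write(payload)
        await writer.drain()
        reply = await reader.read()  # read until the server closes
        writer.close()
        return reply

    # both clients are served by the same loop, concurrently
    replies = await asyncio.gather(ping(port_a, b'hello'),
                                   ping(port_b, b'world'))
    srv_a.close()
    srv_b.close()
    return replies

replies = asyncio.run(main())
print(replies)  # [b'hello', b'world']
```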
Answered by Hamed_gibago
But I used it like this, and it is still synchronous, not async:
import asyncio
import time
from asyncio import get_event_loop

def main(*args):
    loop = get_event_loop()
    coro = asyncio.start_server(handle_echo, '127.0.0.1', 50008, loop=loop)
    srv = loop.run_until_complete(coro)
    loop.run_forever()

@asyncio.coroutine
def handle_echo(reader, writer):
    data = yield from reader.read(500)
    message = data.decode(encoding='utf-8')
    nameindex = ('name="calculator2"' in message)
    if nameindex:
        time.sleep(5)
        writer.write("Content-Length: 1\r\n\r\n2".encode())
        yield from writer.drain()
    else:
        writer.write("Content-Length: 1\r\n\r\n1".encode())
        yield from writer.drain()
    print("Close the client socket")
    writer.close()
If the received value contains name="calculator2", I wait for 5 seconds; if not, I answer and write the data immediately. But when I test it, I first send data containing name="calculator2" and then data without it, yet the second request is handled only after the first one's 5 seconds are done.
It's sequential. What is wrong with it? And by the way, how should I get the connected client's IP and port?
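For what it's worth: time.sleep(5) blocks the event loop's only thread, so no other connection can be served until it returns; the non-blocking equivalent is await asyncio.sleep(). The client's address is available from writer.get_extra_info('peername'). A sketch (sleep shortened to 1 second, payloads and port choice are illustrative):

```python
import asyncio

async def handle_echo(reader, writer):
    data = await reader.read(500)
    message = data.decode('utf-8')
    peer = writer.get_extra_info('peername')  # (client_ip, client_port, ...)
    if 'name="calculator2"' in message:
        # non-blocking pause: other connections keep being served meanwhile
        await asyncio.sleep(1)
        writer.write(b"Content-Length: 1\r\n\r\n2")
    else:
        writer.write(b"Content-Length: 1\r\n\r\n1")
    await writer.drain()
    writer.close()

async def main():
    srv = await asyncio.start_server(handle_echo, '127.0.0.1', 0)
    port = srv.sockets[0].getsockname()[1]

    async def ask(payload):
        reader, writer = await asyncio.open_connection('127.0.0.1', port)
        writer.write(payload)
        await writer.drain()
        reply = await reader.read()  # read until the server closes
        writer.close()
        return reply

    # the slow request no longer delays the fast one
    slow, fast = await asyncio.gather(ask(b'name="calculator2"'),
                                      ask(b'other request'))
    srv.close()
    return slow, fast

slow, fast = asyncio.run(main())
print(slow, fast)
```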
Answered by kissgyorgy
The whole point of asyncio is that you can run multiple thousands of I/O-heavy tasks concurrently, so you don't need Threads at all; this is exactly what asyncio is made for. Just run the two coroutines (SNMP and proxy) in the same loop and that's it. You have to make both of them available to the event loop BEFORE calling loop.run_forever(). Something like this:
import asyncio

async def snmp():
    print("Doing the snmp thing")
    await asyncio.sleep(1)

async def proxy():
    print("Doing the proxy thing")
    await asyncio.sleep(2)

async def main():
    while True:
        await snmp()
        await proxy()

loop = asyncio.get_event_loop()
loop.create_task(main())
loop.run_forever()
I don't know the structure of your code, so the different modules might have their own infinite loop or something; in that case, you can run something like this:
import asyncio

async def snmp():
    while True:
        print("Doing the snmp thing")
        await asyncio.sleep(1)

async def proxy():
    while True:
        print("Doing the proxy thing")
        await asyncio.sleep(2)

loop = asyncio.get_event_loop()
loop.create_task(snmp())
loop.create_task(proxy())
loop.run_forever()
Remember, both snmp and proxy need to be coroutines (async def) written in an asyncio-aware manner. asyncio will not make simple blocking Python functions suddenly "async".
In your specific case, I suspect that you are confused a little bit (no offense!), because well-written async modules will never block each other in the same loop. If this is the case, you don't need asyncio at all and can simply run one of them in a separate Thread without dealing with any asyncio stuff.