Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow, original question: http://stackoverflow.com/questions/1805958/

Date: 2020-11-03 23:05:28  Source: igfitidea

Python asynchronous callbacks and generators

python · asynchronous · generator

Asked by spinlock

I'm trying to convert a synchronous library to use an internal asynchronous IO framework. I have several methods that look like this:


def foo():
  ....
  sync_call_1()   # synchronous blocking call
  ....
  sync_call_2()   # synchronous blocking call
  ....
  return bar

For each of the synchronous functions (sync_call_*), I have written a corresponding async function that takes a callback. E.g.


def async_call_1(callback=None):
  # do the I/O, then invoke the callback when done
  callback()

Now for the python newbie question -- what's the easiest way to translate the existing methods to use these new async methods instead? That is, the method foo() above now needs to be:


def async_foo(callback):
  # Do the foo() stuff using async_call_*
  callback()

One obvious choice is to pass a callback into each async method which effectively "resumes" the calling "foo" function, and then call the final callback at the very end of the method. However, that makes the code brittle and ugly, and I would need to add a new callback for every call to an async_call_* method.

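To make the objection concrete, here is a minimal sketch of that callback-per-step style (all names are hypothetical stand-ins; the async_call_* functions here invoke their callbacks immediately so the example is self-contained):

```python
# Hypothetical illustration of the brittle nested-callback style described
# above: every step of foo() must be split into its own continuation.

def async_call_1(callback):
    # a real framework would invoke the callback later, from its event loop
    callback("first")

def async_call_2(callback):
    callback("second")

def async_foo(callback):
    def after_call_1(result_1):
        # resume foo's logic here, then kick off the next async call
        def after_call_2(result_2):
            bar = (result_1, result_2)
            callback(bar)  # "return" by calling the final callback
        async_call_2(after_call_2)
    async_call_1(after_call_1)

out = []
async_foo(out.append)
print(out)  # → [('first', 'second')]
```

Each additional async call adds another level of nesting, which is exactly the brittleness the question complains about.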

Is there an easy way to do that using a python idiom, such as a generator or coroutine?


Answered by Beni Cherniavsky-Paskin

UPDATE: take this with a grain of salt, as I'm out of touch with modern python async developments, including gevent and asyncio, and don't actually have serious experience with async code.




There are 3 common approaches to thread-less async coding in Python:


  1. Callbacks - ugly but workable; Twisted does this well.

  2. Generators - nice, but they require all your code to follow the style.

  3. Use a Python implementation with real tasklets - Stackless (RIP) and greenlet.


Unfortunately, ideally the whole program should use one style, or things become complicated. If you are OK with your library exposing a fully synchronous interface, you are probably fine, but if you want several calls to your library to work in parallel, especially in parallel with other async code, then you need a common event "reactor" that can work with all the code.

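To illustrate what such a "reactor" is, here is a toy sketch (not any real framework's API): a single loop that drains a queue of scheduled callbacks, which all the async code in the program must share:

```python
import collections

# A toy event "reactor": one queue of pending callbacks, drained by one loop.
# Any async code in the same program must schedule work through this loop,
# which is why mixing two async styles (each with its own loop) gets messy.

class Reactor:
    def __init__(self):
        self._pending = collections.deque()

    def call_soon(self, func, *args):
        # schedule func(*args) to run on the next loop iteration
        self._pending.append((func, args))

    def run(self):
        # process callbacks until nothing is scheduled
        while self._pending:
            func, args = self._pending.popleft()
            func(*args)

reactor = Reactor()
order = []
reactor.call_soon(order.append, "a")
reactor.call_soon(order.append, "b")
reactor.run()
print(order)  # → ['a', 'b']
```

Real reactors (Twisted's, asyncio's) also multiplex I/O readiness and timers, but the single shared scheduling loop is the essential idea.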

So if you have (or expect the user to have) other async code in the application, adopting the same model is probably smart.


If you don't want to understand the whole mess, consider using bad old threads. They are also ugly, but work with everything else.

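For example, one common thread-based pattern (a sketch, not from the original answer) runs the blocking function on a worker thread and delivers its result through a callback; note that the callback then fires on the worker thread, not the caller's:

```python
import threading

def sync_foo():
    # stand-in for a blocking synchronous call
    return "bar"

def run_async(func, callback):
    # Run a blocking function on a worker thread; when it finishes,
    # invoke the callback with the result (on the worker thread).
    def worker():
        callback(func())
    t = threading.Thread(target=worker)
    t.start()
    return t

results = []
t = run_async(sync_foo, results.append)
t.join()  # wait for the worker so the result is visible
print(results)  # → ['bar']
```

This keeps the library's code fully synchronous while still offering a callback-shaped interface, at the usual cost of thread-safety concerns.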

If you do want to understand how coroutines might help you - and how they might complicate things - David Beazley's "A Curious Course on Coroutines and Concurrency" is good stuff.


Greenlets might actually be the cleanest way if you can use the extension. I don't have any experience with them, so can't say much.


Answered by Anand Chitipothu

You need to make the function foo async as well. How about this approach?


@make_async
def foo(somearg, callback):
    # This function is now async. Expect a callback argument.
    ...

    # change 
    #       x = sync_call1(somearg, some_other_arg)
    # to the following:
    x = yield async_call1, somearg, some_other_arg
    ...

    # same transformation again
    y = yield async_call2, x
    ...

    # change
    #     return bar
    # to a callback call
    callback(bar)

And make_async can be defined like this:


def make_async(f):
    """Decorator to convert sync function to async
    using the above mentioned transformations"""
    def g(*a, **kw):
        async_call(f(*a, **kw))
    return g

def async_call(it, value=None):
    # This function is the core of async transformation.

    try: 
        # send the current value to the iterator and
        # expect function to call and args to pass to it
        x = it.send(value)
    except StopIteration:
        return

    func = x[0]
    args = list(x[1:])

    # define callback and append it to args
    # (assuming that callback is always the last argument)

    callback = lambda new_value: async_call(it, new_value)
    args.append(callback)

    func(*args)

CAUTION: I haven't tested this

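For what it's worth, here is a self-contained variant of the same idea that can be exercised end-to-end. The async_call1/async_call2 stand-ins invoke their callbacks immediately; a real framework would invoke them later from its event loop:

```python
# Runnable sketch of the generator-trampoline pattern from this answer.
# Not the author's tested code: the "async" calls here complete synchronously.

def make_async(f):
    """Decorator converting a sync-looking generator into an async function."""
    def g(*a, **kw):
        async_call(f(*a, **kw))
    return g

def async_call(it, value=None):
    try:
        # send the previous result in; get back (func, arg1, arg2, ...)
        x = it.send(value)
    except StopIteration:
        return
    func, args = x[0], list(x[1:])
    # the callback resumes the generator with the new value
    args.append(lambda new_value: async_call(it, new_value))
    func(*args)

# Stand-in async functions: each takes its args plus a trailing callback.
def async_call1(arg, callback):
    callback(arg + 1)

def async_call2(arg, callback):
    callback(arg * 2)

results = []

@make_async
def foo(somearg, callback):
    x = yield async_call1, somearg   # was: x = sync_call1(somearg)
    y = yield async_call2, x         # was: y = sync_call2(x)
    callback(y)                      # was: return y

foo(10, results.append)
print(results)  # → [22]
```

Each yield hands control back to async_call, which fires the I/O and arranges for the generator to be resumed with the result - so foo reads almost like the original synchronous code.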

Answered by Denis Otkidach

There are several ways to multiplex tasks. We can't say which is best for your case without deeper knowledge of what you are doing. Probably the easiest/most universal way is to use threads. Take a look at this question for some ideas.
