Multithreading for Python Django

Disclaimer: this page is based on a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/18420699/

Multithreading for Python Django

python, django, multithreading, decorator, python-multithreading

Asked by tomcounsell

Some functions should run asynchronously on the web server. Sending emails or data post-processing are typical use cases.

What is the best (or most pythonic) way to write a decorator function to run a function asynchronously?

My setup is a common one: Python, Django, Gunicorn or Waitress, AWS EC2 standard Linux

For example, here's a start:

from threading import Thread

def postpone(function):
    def decorator(*args, **kwargs):
        t = Thread(target = function, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
    return decorator

desired usage:

@postpone
def foo():
    pass #do stuff

Accepted answer by tomcounsell

I've continued using this implementation at scale and in production with no issues.

Decorator definition:

from threading import Thread

def start_new_thread(function):
    def decorator(*args, **kwargs):
        t = Thread(target = function, args=args, kwargs=kwargs)
        t.daemon = True
        t.start()
    return decorator

Example usage:

@start_new_thread
def foo():
  pass  # do stuff

Over time, the stack has been updated and migrated without fail.

Originally Python 2.4.7, Django 1.4, Gunicorn 0.17.2, now Python 3.6, Django 2.1, Waitress 1.1.

If the function does any database work, Django will create a new connection for this thread, and it needs to be closed manually:

from django.db import connection

@start_new_thread
def foo():
  #do stuff
  connection.close()
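
A possible refinement (a sketch, not part of the original answer) is a variant of the decorator that closes the thread's connection itself, so callers don't have to remember to do it:

from threading import Thread
from django.db import connection

def start_new_thread(function):
    def decorator(*args, **kwargs):
        def run():
            try:
                function(*args, **kwargs)
            finally:
                # connections are thread-local, so this closes only this thread's connection
                connection.close()
        t = Thread(target=run)
        t.daemon = True
        t.start()
    return decorator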

Answered by Thomas Orozco

The most common way to do asynchronous processing in Django is to use Celery and django-celery.

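For reference, a minimal Celery task might look like the following (a sketch only; it assumes Celery is installed and a broker is configured for the project, and the task and argument names are illustrative):

# tasks.py
from celery import shared_task

@shared_task
def send_welcome_email(user_id):
    # do the slow work here, e.g. render and send an email
    ...

# in a view, enqueue the task instead of running it inline:
# send_welcome_email.delay(user.id)
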
Answered by Glyn Hymanson

Celery is an asynchronous task queue/job queue. It's well documented and perfect for what you need. I suggest you start with the Celery documentation.

Answered by bombs

tomcounsell's approach works well if there are not too many incoming jobs. If many long-running jobs are started in a short period of time, spawning a lot of threads, the main process will suffer. In this case, you can use a thread pool with a coroutine:

# in my_utils.py

from concurrent.futures import ThreadPoolExecutor

MAX_THREADS = 10


def run_thread_pool():
    """
    Note that this is not a normal function, but a coroutine.
    All jobs are enqueued first before executed and there can be
    no more than 10 threads that run at any time point.
    """
    with ThreadPoolExecutor(max_workers=MAX_THREADS) as executor:
        while True:
            func, args, kwargs = yield
            executor.submit(func, *args, **kwargs)


pool_wrapper = run_thread_pool()

# Advance the coroutine to the first yield (priming)
next(pool_wrapper)

# in another module, e.g. views.py
from my_utils import pool_wrapper

def job(*args, **kwargs):
    pass  # do something

def handle(request):
    # build args and kwargs for the job, then enqueue it in the pool
    args, kwargs = (), {}
    pool_wrapper.send((job, args, kwargs))
    # return a response
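
If the coroutine indirection isn't needed, roughly the same effect can be achieved by submitting work to a module-level executor from a decorator, which also matches the decorator style asked for in the question (a sketch; names such as run_in_pool are illustrative):

from concurrent.futures import ThreadPoolExecutor
from functools import wraps

MAX_THREADS = 10
_executor = ThreadPoolExecutor(max_workers=MAX_THREADS)

def run_in_pool(function):
    @wraps(function)
    def wrapper(*args, **kwargs):
        # enqueue the call; at most MAX_THREADS jobs run concurrently
        return _executor.submit(function, *args, **kwargs)
    return wrapper

@run_in_pool
def job(*args, **kwargs):
    pass  # do something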