Python: How to keep multiple independent celery queues?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original source: http://stackoverflow.com/questions/19853378/

Date: 2020-08-19 14:48:55 | Source: igfitidea

How to keep multiple independent celery queues?

python, celery

Asked by jwoww

I'm trying to keep multiple celery queues with different tasks and workers in the same redis database. Really just a convenience issue of only wanting one redis server rather than two on my machine.


I followed the celery tutorial docs verbatim, as it was the only way to get it to work for me. Now when I try to duplicate everything with slightly tweaked names/queues, it keeps erroring out.


Note - I'm newish to Python and Celery, which is obviously part of the problem. I'm not sure which occurrences of "task/tasks" are ordinary names versus special words.


My condensed version of docs: Run celery -A tasks worker to spawn the workers. tasks.py contains task code with celery = Celery('tasks', broker='redis://localhost') to connect to Celery and @task() above my functions that I want to delay.

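For reference, a minimal tasks.py along these lines might look like the sketch below (assuming @task() refers to the app instance's decorator; do_work is the example task used later, and its body is a placeholder):

from celery import Celery

# connect to the local Redis broker, as in the tutorial setup
celery = Celery('tasks', broker='redis://localhost')

@celery.task()
def do_work():
    # placeholder body for the example task
    print('doing work')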

Within my program for queueing tasks...


from tasks import do_work
do_work.delay()
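(As I understand it, .delay() is just shorthand for .apply_async() with default options, so the task ends up on whichever queue the routing configuration selects.)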

So given all of the above, what are the steps I need to take to turn this into two types of tasks that run independently on separate queues and workers? For example, blue_tasks and red_tasks?


I've tried changing all instances of tasks to blue_tasks or red_tasks. However, when I queue blue_tasks, the red_tasks workers I've started up start trying to work on them.


I read about default queues and such, so I tried this code, which didn't work:


from kombu import Exchange, Queue

CELERY_DEFAULT_QUEUE = 'red'
CELERY_QUEUES = (
    Queue('red', Exchange('red'), routing_key='red'),
)

As a side note, I don't understand why celery worker errors out with celery attempting to connect to a default amqp instance, while celery -A tasks worker tells celery to connect to Redis. What task code is celery worker attempting to run on the worker if nothing has been specified?


Accepted answer by dbr

By default everything goes into a default queue named celery (and this is what celery worker will process if no queue is specified).


So say you have your do_work task function in django_project_root/myapp/tasks.py.


You could configure the do_work task to live in its own queue like so:


CELERY_ROUTES = {
    # route the task, by its fully qualified name, to the 'red' queue
    'myapp.tasks.do_work': {'queue': 'red'},
}

Then run a worker using celery worker -Q red and it will only process things in that queue (another worker invoked with celery worker will only pick up things in the default queue).

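Applied to the blue/red split from the question, a hypothetical configuration might look like this (the task paths and queue names are illustrative, not from the original answer):

CELERY_ROUTES = {
    'myapp.tasks.blue_task': {'queue': 'blue'},
    'myapp.tasks.red_task': {'queue': 'red'},
}

with one worker consuming each queue:

celery -A tasks worker -Q blue
celery -A tasks worker -Q red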

The task routing section in the documentation should explain all.


Answer by josepainumkal

To route tasks to different queues dynamically, follow the steps below:


1) Specify the name of the queue with the 'queue' argument:


# the same task can be sent to different queues at call time
celery.send_task('job1', args=[], kwargs={}, queue='queue_name_1')
celery.send_task('job1', args=[], kwargs={}, queue='queue_name_2')

(Here the same job is sent to two different queues.)

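For context, a minimal producer along these lines might look like the sketch below (hypothetical; the app only needs the broker URL, since send_task dispatches by the registered task name string rather than importing the task code):

from celery import Celery

# hypothetical producer: no task code is imported here,
# send_task dispatches purely by task name
celery = Celery(broker='redis://localhost')

celery.send_task('job1', queue='queue_name_1')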

2) Add the following entry to the configuration file:


CELERY_CREATE_MISSING_QUEUES = True
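(In recent Celery versions this setting defaults to enabled, so queues referenced at send time or via -Q are declared automatically; setting it explicitly mainly documents the intent.)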

3) While starting the worker, use -Q to specify the queue name from which jobs should be consumed:


celery -A proj worker -l info -Q queue1   # consumes only queue1
celery -A proj worker -l info -Q queue2   # consumes only queue2
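To double-check which queues each worker is consuming, a command like celery -A proj inspect active_queues should list them (assuming your broker supports Celery's remote-control commands, which Redis does).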