
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same CC BY-SA license and attribute it to the original authors (not the translator): StackOverflow. Original question: http://stackoverflow.com/questions/1336489/

Date: 2020-11-03 21:59:22  Source: igfitidea

Job queue implementation for Python

Tags: python, job-queue

Asked by

Do you know/use any distributed job queue for Python? Can you share links or tools?


Accepted answer by Vinay Sajip

In addition to multiprocessing, there's also the Celery project, if you're using Django.


Answer by optixx

Pyres is a Resque clone built in Python. Resque is used by GitHub as their message queue. Both use Redis as the queue backend and provide a web-based monitoring application.


http://binarydud.github.com/pyres/intro.html


Answer by Michael Sparks

There's also "bucker" by Sylvain Hellegouarch, which you can find here:


It describes itself like this:


  • bucker is a queue system that supports multiple storage backends for the queue (memcached and Amazon SQS for now) and is driven by XML messages sent over a TCP connection between a client and the queue server.

Answer by nos

Look at beanstalkd


Answer by superisaac

redqueue? It's implemented in Python on the Tornado framework, speaks the memcached protocol, and can optionally persist to log files. Currently it can also behave like beanstalkd, supporting the reserve/delete pattern of the memcache protocol as well.


REDQUEUE


Answer by versale

If you think that Celery is too heavy for your needs then you might want to look at the simple distributed task queue:


Answer by Morgan

It's a year late or whatever, but this is something I've hacked together to make a queue of Processes, executing only X of them at a time. http://github.com/goosemo/job_queue

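The same "only X at a time" behaviour can be sketched with the standard library alone, for example by capping a multiprocessing.Pool at X worker processes — a sketch under that assumption, not the code from the linked repository:

```python
from multiprocessing import Pool

def job(n):
    """A stand-in for real work; here it just squares its input."""
    return n * n

if __name__ == "__main__":
    # At most 2 jobs run concurrently; the rest wait in the pool's
    # internal task queue until a worker process frees up.
    with Pool(processes=2) as pool:
        results = pool.map(job, range(6))
    print(results)  # [0, 1, 4, 9, 16, 25]
```

The `processes` argument is the concurrency cap; `map` blocks until every queued job has finished.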

Answer by djc

You probably want to look at multiprocessing's Queue. It is included in Python 2.6; for earlier versions of Python, get it from PyPI.


Standard library documentation: http://docs.python.org/library/multiprocessing.html
On PyPI: http://pypi.python.org/pypi/multiprocessing

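A minimal sketch of a worker fed jobs through a multiprocessing.Queue — the job here (doubling a number) and the None sentinel convention are illustrative choices, not prescribed by the library:

```python
from multiprocessing import Process, Queue

def worker(jobs, results):
    """Pull jobs off the queue until a None sentinel arrives."""
    while True:
        job = jobs.get()  # blocks until a job is available
        if job is None:
            break
        results.put(job * 2)

if __name__ == "__main__":
    jobs, results = Queue(), Queue()
    w = Process(target=worker, args=(jobs, results))
    w.start()
    for n in range(3):
        jobs.put(n)
    jobs.put(None)  # sentinel tells the worker to stop
    w.join()
    print(sorted(results.get() for _ in range(3)))  # [0, 2, 4]
```

Both queues are process-safe, so several worker processes can share the same `jobs` queue; with more than one worker you would push one sentinel per worker.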

Answer by mac2017

Also, there is the Unix 'at' command.


For more info: man at
