Running asynchronous jobs in the background (Laravel)

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow.

Original URL: http://stackoverflow.com/questions/32972319/
Asked by Ilyas Serter
I know Laravel's queue drivers such as redis and beanstalkd, and I read that you can increase the number of workers for beanstalkd, etc. However, I'm just not sure whether these solutions are right for my scenario. Here's what I need:
I listen to an XML feed over a socket connection, and the data just keeps coming rapidly, forever. I receive tens of XML documents per second.
I read data from this socket line by line, and once I reach the XML closing tag, I send the buffer to another process to be parsed. I used to just encode the XML in base64 and run a separate PHP process for each document: shell_exec('php parse.php ' . $base64XML);
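The read-and-fork approach described above can be sketched roughly as follows. This is a minimal illustration, not the asker's actual code: the feed host, port, and the `</match>` closing tag are assumptions, and `parse.php` is the asker's own script.

```php
<?php
// Sketch: read an XML feed line by line from a socket and hand each
// complete document to a background child process for parsing.
// Host, port, and the closing tag are illustrative assumptions.

$socket = fsockopen('feed.example.com', 4000, $errno, $errstr, 30);
if ($socket === false) {
    die("Connection failed: $errstr ($errno)\n");
}

$buffer = '';
while (($line = fgets($socket)) !== false) {
    $buffer .= $line;
    if (strpos($line, '</match>') !== false) {   // assumed closing tag
        $base64XML = base64_encode($buffer);
        // Trailing space after parse.php matters; '&' backgrounds the child
        // so the read loop is never blocked by a slow parse.
        shell_exec('php parse.php ' . escapeshellarg($base64XML) . ' > /dev/null 2>&1 &');
        $buffer = '';
    }
}
fclose($socket);
```

Note that `escapeshellarg()` and redirecting output are important here: without backgrounding and redirection, `shell_exec` would block until each child finishes.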
This allowed me to parse this never-ending XML data quite rapidly; a sort of manual threading. Now I'd like to implement the same functionality with Laravel, but I wonder if there is a better way to do it. I believe Artisan::call('command') doesn't push the work to the background. I could of course do a shell_exec within Laravel too, but I'd like to know whether I can benefit from Beanstalkd or a similar solution.
So the real question is this: how can I set the number of queue workers for the beanstalkd or redis drivers? I'd like 20 workers running at the same time, more if possible.
A slightly less important question is: how many workers is too many? If I had a very high-end dedicated server that can handle the load just fine, would creating 500 workers with these tools cause any problems at the code level?
Answered by Atrakeur
Well, Laravel queues are made exactly for that.
Basically, you have to create a Job class; all the heavy work you want to do on your XML document belongs there. Then you fetch your XML out of the socket, and as soon as you have received one complete document, you push it onto your queue.
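A minimal sketch of such a Job class, assuming a recent Laravel 5.x project layout. The class name, property name, and the parsing done in `handle()` are all illustrative, not part of the original answer:

```php
<?php
// Sketch of a queued job carrying one XML document (Laravel 5.x style).
// Class name and parsing logic are illustrative assumptions.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ParseXmlDocument implements ShouldQueue
{
    use InteractsWithQueue, Queueable, SerializesModels;

    protected $xml;

    public function __construct(string $xml)
    {
        $this->xml = $xml;
    }

    public function handle()
    {
        // All the heavy parsing work goes here, off the socket-reading path.
        $document = simplexml_load_string($this->xml);
        // ... process $document ...
    }
}
```

In the socket-reading loop, pushing a document then becomes a single non-blocking call: `dispatch(new ParseXmlDocument($buffer));`.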
Later, a queue worker will pick it up from the queue and do the heavy work.
The advantage is that if you queue up documents faster than you can process them, the queue absorbs the high-load moments and holds the tasks for later.
I also don't recommend doing it without a queue (with a fork like you did). If too many documents come in, you'll create too many child processes and overload your server. Bookkeeping those processes correctly is risky and not worth it when a simple queue with a fixed number of workers solves all these problems out of the box.
Answered by Ilyas Serter
After a little more research, I found how to set the number of worker processes. I had missed that part in the documentation. Silly me. I still wonder whether this supervisor tool can handle hundreds of workers for situations like mine. Hopefully someone can share their experience, but if not, I'll update this answer once I run a performance test this week.
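For reference, the usual way to run a fixed number of Laravel queue workers is a Supervisor program block with `numprocs`. The paths, program name, and queue options below are example values only; older Laravel 5.0/5.1 installs used `queue:work --daemon` instead of a plain `queue:work`:

```ini
; /etc/supervisor/conf.d/laravel-worker.conf -- example paths and names
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work redis --sleep=1 --tries=3
numprocs=20
autostart=true
autorestart=true
user=www-data
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log
```

Raising `numprocs` to 500 is mostly a question of RAM (each worker is a full PHP process) and of how many concurrent connections your queue backend and database will tolerate.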
Answered by mFlorin
I can tell you from experience that shell_exec() is not the ideal way to run async tasks in PHP. It seems fine while developing, but if you have a small VPS (1-2 GB of RAM) you can overload your server, and apache/nginx/sql/something could break while you're not around, leaving your website down for hours or days.
I recommend Laravel Queues + the Scheduler for this kind of thing.
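One way the Scheduler complements long-running workers, sketched here as an assumption about what this answer has in mind, is periodically issuing `queue:restart` so workers reload fresh code and release leaked memory. This is a generic Laravel 5.x `app/Console/Kernel.php` sketch, not code from the answer:

```php
<?php
// Sketch: use Laravel's scheduler to gracefully restart queue workers
// every hour. Supervisor (or similar) then respawns them automatically.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // queue:restart signals workers to exit after their current job.
        $schedule->command('queue:restart')->hourly();
    }
}
```

This only sends a restart signal; it relies on a process monitor such as Supervisor to bring the workers back up.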