Laravel Job Queue not processing using Redis driver

Notice: this page is a translation of a popular Stack Overflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA terms and attribute it to the original authors (not the translator). Original question: http://stackoverflow.com/questions/34012819/

Tags: php, laravel, redis, publish-subscribe

Asked by ReactingToAngularVues

I'm creating a job, pushing it onto a custom queue, and trying to use the Redis driver to handle the job when it hits the queue, without success:

class MyController extends Controller {
    public function method() {
        // Push the job onto the custom "live" queue and dispatch it
        $job = (new UpdateLiveThreadJob())->onQueue('live');
        $this->dispatch($job);
    }
}

Here is my queue config:

    'default' => env('QUEUE_DRIVER'),

    'redis' => [
        'driver' => 'redis',
        'connection' => 'default',
        'queue'  => 'default',
        'expire' => 60,
    ],
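
For reference, the snippet above is only an excerpt: in a stock Laravel 5 config/queue.php, the redis entry sits inside a connections array alongside the top-level default key. A minimal sketch of the surrounding structure (key names as in Laravel 5.1; the env() fallback value is an assumption):

    // config/queue.php (sketch)
    return [

        'default' => env('QUEUE_DRIVER', 'sync'),

        'connections' => [

            'redis' => [
                'driver'     => 'redis',
                'connection' => 'default',
                'queue'      => 'default',
                'expire'     => 60,
            ],

        ],

    ];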

Here is my .env file:

# Drivers (Queues & Broadcasts)
CACHE_DRIVER=file
SESSION_DRIVER=file
QUEUE_DRIVER=redis
BROADCAST_DRIVER=redis

Here's my job:

use Illuminate\Contracts\Bus\SelfHandling;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Facades\Redis;

class UpdateLiveThreadJob extends Job implements SelfHandling, ShouldQueue
{
    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct()
    {
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        // Rerender content
        $templatedOutput = view('templates.livethreadcontents')->with([
            'updates' => collect(Redis::lrange('live:updates', 0, -1))->reverse()->map(function($update) {
                return json_decode($update);
            })
        ])->render();

        // Connect to external service

        // Update Thread
    }
}
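
For context, handle() assumes every entry in the live:updates Redis list is a JSON-encoded update. Producer code along these lines would put compatible data there; it is purely illustrative and not part of the original question:

    // Hypothetical producer: push a JSON-encoded update onto the list that
    // UpdateLiveThreadJob::handle() later reads back with Redis::lrange().
    Redis::rpush('live:updates', json_encode([
        'body'       => 'Example update text',
        'created_at' => date('Y-m-d H:i:s'),
    ]));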

Indeed, I can change the handle method to do nothing, to make sure nothing inside the job itself is causing it to fail, and it still doesn't process:

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        print_r('test');
    }

Using Redis, I can see it's pushed onto the queue:

> lrange queues:live 0 -1
> // json encoded job present
> llen queues:live
> // shows there is a job in the queue

Yet, it never actually fires, as far as I can tell. Watching php artisan queue:listen shows nothing (only unrelated event broadcasts). What's going on here?

Answered by marcus.ramsden

With Laravel 5.3 there were changes to queues. Now you would run php artisan queue:work --queue=live and that should do what you need.

I've left my original answer below.




Are you remembering to run php artisan queue:listen --queue=live?

You need to specify the queue name when running the listen command, otherwise you end up only listening to the default queue.
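
If a single worker should cover several queues, the queue names can be given as a comma-separated, priority-ordered list. A minimal sketch, assuming a Laravel 5.x worker, that drains live before falling back to default:

    php artisan queue:work --queue=live,default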

If you want to run multiple queues and manage things in production, you can use something like Upstart (not directly related to setting up Laravel queues, but it provides a good starting point) or Supervisor to manage the worker processes; a rough Supervisor example is sketched below. Both of these are available on Forge and Homestead.
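
As a rough idea of the Supervisor route, a program entry along these lines keeps a worker for the live queue running and restarts it if it dies. The paths, user and log location below are assumptions, not from the original answer:

    [program:laravel-live-worker]
    command=php /var/www/your-app/artisan queue:work --daemon --queue=live --sleep=3 --tries=3
    directory=/var/www/your-app
    user=www-data
    autostart=true
    autorestart=true
    numprocs=1
    redirect_stderr=true
    stdout_logfile=/var/www/your-app/storage/logs/worker.log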

Finally, assuming you are on Laravel 5, you may want to consider running php artisan queue:work --daemon --queue=live, as this reduces the CPU overhead of running the worker: it doesn't reload the framework with each job. But you must remember to restart the worker when you deploy new code for your jobs, otherwise your changes won't be picked up.
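
One way to handle that restart is to call queue:restart from the deploy script; it signals daemon workers to exit gracefully once their current job finishes, so Supervisor (or whatever manages them) brings them back up on the new code. A minimal sketch (the surrounding deploy steps are assumptions):

    # after pulling new code and running composer install / migrations:
    php artisan queue:restart   # daemon workers exit after their current job and get restarted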