Limiting Eloquent chunks

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, cite the original address, and attribute it to the original authors (not me): StackOverflow.
Original source: http://stackoverflow.com/questions/39029449/
Asked by eComEvo
I have a very large result set to process and so I'm using the chunk() method to reduce the memory footprint of the job. However, I only want to process a certain number of total results to prevent the job from running too long.
Currently I'm doing this, but it does not seem like an elegant solution:
$count = 0;
$max = 1000000;

$lists = Lists::whereReady(true);

$lists->chunk(1000, function (Collection $lists) use (&$count, $max) {
    if ($count >= $max)
        return;

    foreach ($lists as $list) {
        if ($count >= $max)
            break;

        $count++;

        // ...do stuff
    }
});
Is there a cleaner way to do this?
Accepted answer by patricus
As of right now, I don't believe so.
There have been some issues and pull requests submitted to have chunk respect previously set skip/limits, but Taylor has closed them, stating that it is expected behavior for chunk to overwrite these.
There is currently an open issue in the laravel/internals repo where he said he'd take a look again, but I don't think it is high on the priority list. I doubt it is something he would work on, but he may be more receptive to another pull request now.
Your solution looks fine, except for one thing. chunk() will end up reading your entire table, unless you return false from your closure. Currently, you are just returning null, so even though your "max" is set to 1000000, it will still read the entire table. If you return false from your closure when $count >= $max, chunk() will stop querying the database. It will cause chunk() to return false itself, but your example code doesn't care about the return of chunk() anyway, so that's okay.
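For illustration, here is a minimal sketch of that change applied to the code from the question (same $count/$max variables); the only meaningful difference is that the early return now returns false, so chunk() stops fetching further chunks:

$count = 0;
$max = 1000000;

$lists = Lists::whereReady(true);

$lists->chunk(1000, function (Collection $lists) use (&$count, $max) {
    // Returning false (rather than null) tells chunk() to stop
    // querying the database for any further chunks.
    if ($count >= $max) {
        return false;
    }

    foreach ($lists as $list) {
        if ($count >= $max) {
            break;
        }

        $count++;

        // ...do stuff
    }
});

Because the check sits at the top of the closure, one additional chunk may still be fetched before the closure returns false, but the table is no longer read to the end.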
Another option, assuming you're using sequential ids, would be to get the ending id and then add a where clause to your chunked query to get all the records with an id less than your max id. So, something like:
$max = 1000000;

$maxId = Lists::whereReady(true)->skip($max)->take(1)->value('id');

$lists = Lists::whereReady(true)->where('id', '<', $maxId);

$lists->chunk(1000, function (Collection $lists) {
    foreach ($lists as $list) {
        // ...do stuff
    }
});
The code is slightly cleaner, but it is still a hack, and it requires one extra query (to get the max id).