How to insert big data in Laravel?

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/51487769/



Tags: laravel, laravel-5, insert, bigdata, laravel-5.6

Asked by Success Man

I am using Laravel 5.6.

My script to insert big data is like this:

...
$insert_data = [];
foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);
    $posting_date = $posting_date->format('Y-m-d');
    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];
    $insert_data[] = $data;
}
\DB::table('items_details')->insert($insert_data);

I have tried to insert 100 records with the script, and it works: the data is inserted successfully.

But if I try to insert 50,000 records with the script, it becomes very slow. I waited about 10 minutes and it did not finish; instead I got an error like this:

504 Gateway Time-out

How can I solve this problem?

Answered by Vit Kos

As stated, chunks won't really help you in this case if it is an execution-time problem. I think the bulk insert you are trying to use cannot handle that amount of data, so I see two options:

1 - Reorganise your code to use chunks properly. It will look something like this:

$insert_data = [];

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $data = [
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ];

    $insert_data[] = $data;
}

$insert_data = collect($insert_data); // Make a collection to use the chunk method

// it will chunk the dataset in smaller collections containing 500 values each. 
// Play with the value to get best result
$chunks = $insert_data->chunk(500);

foreach ($chunks as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}

This way each bulk insert will contain less data and can be processed rather quickly.
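As a side note, if the chunked inserts are still slow, wrapping the whole loop in a single transaction can cut the per-statement commit overhead. A minimal sketch (not from the original answer), assuming the $chunks collection from the code above:

// One commit for all chunks instead of one commit per insert
// statement, which is usually noticeably faster on large imports.
\DB::transaction(function () use ($chunks) {
    foreach ($chunks as $chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    }
});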

2 - If your host supports runtime overloads, you can add a directive right before the code starts to execute:

ini_set('max_execution_time', 120); // time in seconds

$insert_data = [];

foreach ($json['value'] as $value)
{
   ...
}

To read more, go to the official docs.
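If the host does not allow runtime overloads, or the 504 comes from the gateway in front of PHP rather than from PHP itself, another common workaround is to run the import outside the web request, for example as an artisan console command, where the web server's timeout does not apply. A minimal sketch; the class and the command name items:import are made up for illustration:

namespace App\Console\Commands;

use Illuminate\Console\Command;

class ImportItems extends Command
{
    // Run with: php artisan items:import
    protected $signature = 'items:import';
    protected $description = 'Insert the item details in chunks';

    public function handle()
    {
        // ... build $insert_data and insert it in chunks as shown above
    }
}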

Answered by Anton Vlasenko

It makes no sense to build an array and then convert it to a collection. We can get rid of the array entirely:

$insert_data = collect();

foreach ($json['value'] as $value) {
    $posting_date = Carbon::parse($value['Posting_Date']);

    $posting_date = $posting_date->format('Y-m-d');

    $insert_data->push([
        'item_no'                   => $value['Item_No'],
        'entry_no'                  => $value['Entry_No'], 
        'document_no'               => $value['Document_No'],
        'posting_date'              => $posting_date,
        ....
    ]);
}

foreach ($insert_data->chunk(500) as $chunk) {
    \DB::table('items_details')->insert($chunk->toArray());
}
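The same idea can also be written fluently with the standard collection methods map, chunk, and each; a sketch, with the column list abbreviated as in the original:

collect($json['value'])
    ->map(function ($value) {
        return [
            'item_no'      => $value['Item_No'],
            'entry_no'     => $value['Entry_No'],
            'document_no'  => $value['Document_No'],
            'posting_date' => Carbon::parse($value['Posting_Date'])->format('Y-m-d'),
            // ... remaining columns as in the original
        ];
    })
    ->chunk(500)
    ->each(function ($chunk) {
        \DB::table('items_details')->insert($chunk->toArray());
    });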

Answered by piyush anques

Here is a very good and very fast solution for inserting data:

$no_of_data = 1000000;
$test_data = array();

// Build one million dummy rows to insert.
for ($i = 0; $i < $no_of_data; $i++) {
    $test_data[$i]['number'] = "1234567890";
    $test_data[$i]['message'] = "Test Data";
    $test_data[$i]['status'] = "Delivered";
}

// Split the rows into chunks of 1000 and insert each chunk separately.
$chunk_data = array_chunk($test_data, 1000);
if (!empty($chunk_data)) {
    foreach ($chunk_data as $chunk_data_val) {
        DB::table('messages')->insert($chunk_data_val);
    }
}
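To pick a good chunk size for your own table, it can help to time the insert loop; a small sketch, assuming the $chunk_data array built above:

$start = microtime(true);

foreach ($chunk_data as $chunk_data_val) {
    DB::table('messages')->insert($chunk_data_val);
}

// microtime(true) returns the current time as a float in seconds,
// so the difference is the elapsed wall-clock time.
printf("Inserted %d rows in %.2f seconds\n", $no_of_data, microtime(true) - $start);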