Import of 50K+ Records in MySQL Gives General error: 1390 Prepared statement contains too many placeholders

Note: this page is a translation of a popular StackOverFlow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverFlow
Original URL: http://stackoverflow.com/questions/18100782/
Asked by Gareth Daine
Has anyone ever come across this error: General error: 1390 Prepared statement contains too many placeholders
I just did an import of over 50,000 records via SequelPro, and now when I go to view these records in my view (Laravel 4) I get General error: 1390 Prepared statement contains too many placeholders.
The index() method below, in my AdminNotesController.php file, is what generates the query and renders the view.
public function index()
{
    $created_at_value = Input::get('created_at_value');
    $note_types_value = Input::get('note_types_value');
    $contact_names_value = Input::get('contact_names_value');
    $user_names_value = Input::get('user_names_value');
    $account_managers_value = Input::get('account_managers_value');

    if (is_null($created_at_value)) $created_at_value = DB::table('notes')->lists('created_at');
    if (is_null($note_types_value)) $note_types_value = DB::table('note_types')->lists('type');
    if (is_null($contact_names_value)) $contact_names_value = DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname');
    if (is_null($user_names_value)) $user_names_value = DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname');

    // In the view, there is a dropdown box that allows the user to select the number of records to show per page. Retrieve that value or set a default.
    $perPage = Input::get('perPage', 10);

    // This retrieves the order that the user selected by clicking on a table column title. The value is placed in the session via the getOrder() method and is used later in the Eloquent query and joins.
    $order = Session::get('account.order', 'company_name.asc');
    $order = explode('.', $order);

    $notes_query = Note::leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')));

    if (!empty($created_at_value)) $notes_query = $notes_query->whereIn('notes.created_at', $created_at_value);

    $notes = $notes_query->whereIn('note_types.type', $note_types_value)
        ->whereIn(DB::raw('CONCAT(contacts.first_name," ",contacts.last_name)'), $contact_names_value)
        ->whereIn(DB::raw('CONCAT(users.first_name," ",users.last_name)'), $user_names_value)
        ->paginate($perPage)->appends(array(
            'created_at_value' => Input::get('created_at_value'),
            'note_types_value' => Input::get('note_types_value'),
            'contact_names_value' => Input::get('contact_names_value'),
            'user_names_value' => Input::get('user_names_value'),
        ));

    $notes_trash = Note::onlyTrashed()
        ->leftJoin('note_types', 'note_types.id', '=', 'notes.note_type_id')
        ->leftJoin('users', 'users.id', '=', 'notes.user_id')
        ->leftJoin('contacts', 'contacts.id', '=', 'notes.contact_id')
        ->orderBy($order[0], $order[1])
        ->select(array('notes.*', DB::raw('notes.id as nid')))
        ->get();

    $this->layout->content = View::make('admin.notes.index', array(
        'notes' => $notes,
        'created_at' => DB::table('notes')->lists('created_at', 'created_at'),
        'note_types' => DB::table('note_types')->lists('type', 'type'),
        'contacts' => DB::table('contacts')->select(DB::raw('CONCAT(first_name," ",last_name) as cname'))->lists('cname', 'cname'),
        'accounts' => Account::lists('company_name', 'company_name'),
        'users' => DB::table('users')->select(DB::raw('CONCAT(first_name," ",last_name) as uname'))->lists('uname', 'uname'),
        'notes_trash' => $notes_trash,
        'perPage' => $perPage,
    ));
}
Any advice would be appreciated. Thanks.
Answered by Martin
There is a limit of 65,535 (2^16 - 1) placeholders in MariaDB 5.5, which is supposed to have identical behaviour to MySQL 5.5.
Not sure if it is relevant; I tested this on PHP 5.5.12 using MySQLi / MySQLND.
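As a rough sanity check against that limit, you can work out how many rows fit into a single multi-row insert. A minimal sketch, assuming a hypothetical 8-column table:

$maxPlaceholders = 65535;                                          // 2^16 - 1, the per-statement limit above
$columnsPerRow   = 8;                                              // hypothetical column count
$rowsPerInsert   = (int) floor($maxPlaceholders / $columnsPerRow); // 8191 rows per statement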
Answered by Faridul Khan
Solved this issue by using the array_chunk function.
Here is the solution:
foreach (array_chunk($data, 1000) as $t) {
    DB::table('table_name')->insert($t);
}
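If your tables vary in width, a chunk size derived from the column count is safer than a hard-coded 1000. A rough sketch, assuming every row in $data has the same keys and 'table_name' stands in for your table:

$columns   = count(reset($data));                   // number of fields per row
$chunkSize = max(1, (int) floor(60000 / $columns)); // stay comfortably under the 65,535 limit
foreach (array_chunk($data, $chunkSize) as $t) {
    DB::table('table_name')->insert($t);
}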
Answered by Gabe Spradlin
While I think @The Disintegrator is correct about the placeholders being limited, I would not run one query per record.
I have a query that worked fine until I added one more column; now I have 72k placeholders and I get this error. However, that 72k is made up of 9,000 rows with 8 columns. Running this query one record at a time would take days. (I'm trying to import AdWords data into a DB, and it would literally take more than 24 hours to import a day's worth of data if I did it one record at a time. I tried that first.)
What I would recommend is something of a hack. First, determine the maximum number of placeholders you want to allow (say 60k to be safe). Use this number to determine, based on the number of columns, how many complete records you can import/return at once. Create the full array of data for your query. Then use array_chunk and a foreach loop to grab everything you want in the minimum number of queries. Like this:
$maxRecords = 1000;
$sql = 'INSERT INTO ... VALUES ';                 // base statement; column list elided
$qMarks = array_fill(0, $maxRecords, '(?, ...)'); // one "(?, ...)" group per record
$tmp = $sql . implode(', ', $qMarks);
foreach (array_chunk($data, $maxRecords) as $junk => $dataArray) {
    if (count($dataArray) < $maxRecords) { break; }
    // Do your PDO stuff here using $tmp as your SQL statement with all those placeholders - the ?s
}
// Now insert all the leftovers with basically the same code as above, except accounting for
// the fact that you have fewer than $maxRecords now.
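For what it's worth, the "PDO stuff" inside that loop boils down to flattening each chunk into one flat parameter array and executing the prepared multi-row insert. A sketch under assumptions not in the original answer: a hypothetical table my_table with columns a, b and c, an existing PDO handle in $pdo, and $data / $maxRecords as defined above:

$columns  = array('a', 'b', 'c');                                  // hypothetical column list
$rowMarks = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
$sql      = 'INSERT INTO my_table (' . implode(', ', $columns) . ') VALUES '
          . implode(', ', array_fill(0, $maxRecords, $rowMarks));
$stmt     = $pdo->prepare($sql);

foreach (array_chunk($data, $maxRecords) as $dataArray) {
    if (count($dataArray) < $maxRecords) { break; } // leftovers get a shorter statement afterwards
    $params = array();
    foreach ($dataArray as $row) {
        foreach ($columns as $col) {
            $params[] = $row[$col];                 // flatten rows into a single parameter list
        }
    }
    $stmt->execute($params);
}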
Answered by Mike
This error only happens when both of the following conditions are met:
- You are using the MySQL Native Driver (mysqlnd) and not the MySQL client library (libmysqlclient)
- You are not emulating prepares.
If you change either one of these factors, this error will not occur. However, keep in mind that doing both of them is recommended for performance and security reasons, so I would not recommend this solution for anything other than a one-time or temporary problem. To prevent this error from occurring, the fix is as simple as:
$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, true);
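In Laravel you usually do not touch the PDO handle directly, so the equivalent is to pass the attribute through the connection's options array. A sketch, assuming a standard Laravel 4 app/config/database.php mysql connection (the credential values are placeholders):

'mysql' => array(
    'driver'    => 'mysql',
    'host'      => 'localhost',
    'database'  => 'your_database',
    'username'  => 'your_username',
    'password'  => 'your_password',
    'charset'   => 'utf8',
    'collation' => 'utf8_unicode_ci',
    'prefix'    => '',
    'options'   => array(
        PDO::ATTR_EMULATE_PREPARES => true, // build the query client-side instead of a true server-side prepare
    ),
),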
Answered by Wojtek Jarzęcki
Using a Laravel model, this copies all 11,000 records from an SQLite database to a MySQL database in a few seconds. It chunks the data array into 500-record batches:
public function handle(): void
{
    $smodel = new Src_model();
    $smodel->setTable($this->argument('fromtable'));
    $smodel->setConnection('default'); // sqlite database
    $src = $smodel::all()->toArray();

    $dmodel = new Dst_model();
    $dmodel->setTable($this->argument('totable'));
    $dmodel->timestamps = false;
    $stack = $dmodel->getFields();
    $fields = array_shift($stack);

    $condb = DB::connection('mysql');
    $condb->beginTransaction();
    $dmodel::query()->truncate();
    $dmodel->fillable($stack);

    $srcarr = array_chunk($src, 500);
    $isOK = true;
    foreach ($srcarr as $item) {
        if (!$dmodel->query()->insert($item)) $isOK = false;
    }

    if ($isOK) {
        $this->notify("Przenieśliśmy tabelę z tabeli: {$this->argument('fromtable')} do tabeli: {$this->argument('totable')}", 'Będzie świeża jak nigdy!');
        $condb->commit();
    } else {
        $condb->rollBack();
    }
}
Answered by Farid Abbas
My fix for the above issue: when I got this error, I fixed it by reducing the bulk-insertion chunk size from 1000 to 800, and that worked for me. There were actually too many fields in my table, and most of them contain long detail descriptions, like a complete page of text. When I went for the bulk insertion, the service crashed and threw the above error.
Answered by The Disintegrator
I think the number of placeholders is limited to 65536 per query (at least in older MySQL versions).
I really can't discern what this piece of code is generating, but if it's a gigantic query, there's your problem.
You should generate one query per record to import, and wrap those queries in a transaction.
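A minimal sketch of that approach in Laravel, assuming $records is the array being imported and 'table_name' stands in for your target table:

DB::transaction(function () use ($records) {
    foreach ($records as $record) {
        // One small insert per record keeps the placeholder count per statement trivial
        DB::table('table_name')->insert($record);
    }
});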