Mysql insert query returns ERROR 1062 (23000): Duplicate entry '2147483647' for key 'PRIMARY'
Original question: http://stackoverflow.com/questions/18643648/
Note: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me) on StackOverflow.
Asked by UserK
I've noticed an error during an insert query in my database.
mysql> insert into users (name) values ('Gepp');
returned:
ERROR 1062 (23000): Duplicate entry '2147483647' for key 'PRIMARY'
This is the first time I have gotten this error, which may suggest that some kind of limit has been reached. I checked other posts reporting the same error and found that triggers can be the cause, but unfortunately that is not the case here:
mysql> SHOW triggers;
Empty set (0.00 sec)
EDIT: The structure of my users table is shown below:
*************************** 1. row ***************************
Field: uid
Type: int(11)
Null: NO
Key: PRI
Default: NULL
Extra: auto_increment
*************************** 2. row ***************************
Field: name
Type: varchar(50)
Null: NO
Key:
Default: NULL
Extra:
*************************** 3. row ***************************
Field: email
Type: varchar(100)
Null: NO
Key: UNI
Default: NULL
Extra:
*************************** 4. row ***************************
Field: encrypted_password
Type: varchar(80)
Null: NO
Key:
Default: NULL
Extra:
*************************** 5. row ***************************
Field: salt
Type: varchar(10)
Null: NO
Key:
Default: NULL
Extra:
*************************** 6. row ***************************
Field: descrizione
Type: varchar(600)
Null: YES
Key:
Default: NULL
Extra:
*************************** 7. row ***************************
Field: motto
Type: varchar(100)
Null: NO
Key:
Default:
Extra:
*************************** 8. row ***************************
Field: status
Type: varchar(100)
Null: NO
Key:
Default: Hey new gambler! Share your thoughts!
Extra:
*************************** 9. row ***************************
Field: game
Type: varchar(100)
Null: NO
Key:
Default:
Extra:
*************************** 10. row ***************************
Field: pokeroom
Type: varchar(100)
Null: NO
Key:
Default:
Extra:
*************************** 11. row ***************************
Field: score
Type: int(11)
Null: NO
Key:
Default: 0
Extra:
*************************** 12. row ***************************
Field: created_at
Type: datetime
Null: YES
Key:
Default: NULL
Extra:
*************************** 13. row ***************************
Field: updated_at
Type: datetime
Null: YES
Key:
Default: NULL
Extra:
*************************** 14. row ***************************
Field: photo
Type: varchar(500)
Null: NO
Key:
Default:
Extra:
*************************** 15. row ***************************
Field: panorama
Type: varchar(500)
Null: NO
Key:
Default:
Extra:
How can I solve this problem?
Answered by UserK
Thank you guys for your help! There was a bad configuration of the table. The uid column had the primary key and the auto_increment attribute, but in the project I'm working on, users were created with a query like this:
INSERT INTO users(uid, name, email, encrypted_password, salt, created_at) VALUES('12342354355.54534543','bollo','sai','dsfsd','sdsdf','23')
The uid was generated by the PHP function uniqid("", true), and this is what caused the problem:
select uid,id from users;
+------------+----+
| uid | id |
+------------+----+
| 183 | 1 |
| 5224 | 2 |
| 5228 | 3 |
| 52288 | 4 |
| 515620 | 5 |
| 519030 | 6 |
| 5156147 | 8 |
| 5156151 | 9 |
| 5156205 | 10 |
| 5157726 | 11 |
| 52289002 | 12 |
| 515615576 | 13 |
| 2147483647 | 14 |
+------------+----+
15 rows in set (0.00 sec)
As you can see, a new uid created by a query like the one above was always greater than the previous one, so the inserts kept succeeding even though the column's auto_increment was being bypassed. I was lucky for 14 registrations; then the uid values exceeded the maximum allowed by the column definition (2147483647, the largest signed INT), were clamped to that maximum, and collided on the primary key, which caused the duplicate-key error.
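For illustration, this is roughly what happens when values that overflow a signed INT are inserted into such a column. It is a minimal sketch on a throwaway table (uid_demo is not part of the real schema) and assumes a permissive, non-strict sql_mode; in strict mode the first oversized insert would fail with an out-of-range error instead of being clamped:
mysql> CREATE TABLE uid_demo (uid INT NOT NULL PRIMARY KEY);
mysql> INSERT INTO uid_demo (uid) VALUES ('12342354355.54534543');
-- out of range for INT: the value is silently clamped to 2147483647 (with a warning)
mysql> INSERT INTO uid_demo (uid) VALUES ('99999999999.9');
-- clamped to 2147483647 again, so it collides with the previous row:
ERROR 1062 (23000): Duplicate entry '2147483647' for key 'PRIMARY'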
I've solved the problem by removing the primary key from the uid column:
alter table users drop primary key;
then modified the column again to remove the auto_increment attribute (storing the uid as a plain string):
alter table users modify uid varchar(40) not null unique;
and finally added a new column called id in order to track and count user registrations:
alter table users add id int(11) not null auto_increment primary key;
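Note that, depending on the MySQL version, the first DROP PRIMARY KEY step can fail with ERROR 1075 because an auto_increment column must be part of a key. Combining the three changes into a single ALTER avoids that intermediate state; this is only a sketch, and it assumes nothing else (foreign keys, application code) depends on uid staying numeric:
alter table users
    drop primary key,
    modify uid varchar(40) not null unique,
    add id int(11) not null auto_increment primary key;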
In the end, the error was caused by a poor organization of the database and of the functions acting on it. My fault!
Answered by Wayne Roddy
I was getting a very similar duplicate PRIMARY value for the index when running a large script to insert 1,000s of rows.
I dropped the primary index key, ran my script, then re-enabled my "id" column as the primary key, and now everything works fine.
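A rough sketch of that sequence (the table name is illustrative, and it assumes the id values produced by the script are unique, otherwise re-adding the primary key will fail):
alter table my_table drop primary key;
-- ... run the bulk insert script here ...
alter table my_table add primary key (id);
If the id column carries AUTO_INCREMENT, it must keep some index (for example a plain KEY) while the primary key is dropped, or MySQL will refuse the first statement.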
Answered by John Cen
I recently solved a similar issue with error code 1062 (duplicate entry). From the page how to solve mysql error code: 1062 duplicate key? I found that this error occurs when the primary key field's data type has reached its upper limit; changing the data type from INT to BIGINT may help, but whether to change it depends on your requirements. There is a workaround on that page which I think will help you understand this issue better. Thank you.
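If the column really has hit the signed INT ceiling (2,147,483,647) and a numeric key should be kept, widening the type is one option; a sketch using the table from the question (assuming uid is still the auto_increment primary key):
alter table users modify uid bigint not null auto_increment;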