postgresql - integer out of range
Warning: this is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow
Original: http://stackoverflow.com/questions/24308239/
Asked by Torxed
Not the slightest idea why the hell this is happening..
I've set up a table accordingly:
CREATE TABLE raw (
id SERIAL,
regtime float NOT NULL,
time float NOT NULL,
source varchar(15),
sourceport INTEGER,
destination varchar(15),
destport INTEGER,
blocked boolean
); ... + index and grants
I've successfully used this table for a while now, and all of a sudden the following insert doesn't work any longer..
INSERT INTO raw(
time, regtime, blocked, destport, sourceport, source, destination
) VALUES (
1403184512.2283964, 1403184662.118, False, 2, 3, '192.168.0.1', '192.168.0.2'
);
The error is: ERROR: integer out of range
I mean c'mon... Not even sure where to begin debugging this. I'm not out of disk space, and the error itself is kinda discreet...
Answered by Nick Barnes
SERIAL columns are stored as INTEGERs, giving them a maximum value of 2^31-1. So after ~2 billion inserts, your new id values will no longer fit.
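A quick way to confirm that the sequence has hit this ceiling (assuming the default sequence name raw_id_seq, which is what Postgres generates for a SERIAL column named id on a table named raw):

-- If last_value is at or near 2147483647 (2^31-1), new ids no longer fit
SELECT last_value FROM raw_id_seq;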
If you expect this many inserts over the life of your table, create it with a BIGSERIAL (internally a BIGINT, with a maximum of 2^63-1).
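For example, a minimal sketch of the table from the question recreated with a BIGSERIAL key (same columns; index and grants omitted, as in the original):

CREATE TABLE raw (
  id BIGSERIAL,            -- stored as BIGINT, maximum 2^63-1
  regtime float NOT NULL,
  time float NOT NULL,
  source varchar(15),
  sourceport INTEGER,
  destination varchar(15),
  destport INTEGER,
  blocked boolean
);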
If you discover later on that a SERIAL isn't big enough, you can increase the size of an existing field with:
ALTER TABLE raw ALTER COLUMN id TYPE BIGINT;
Note that it's BIGINT here, rather than BIGSERIAL (as serials aren't real types). And keep in mind that, if you actually have 2 billion records in your table, this might take a little while...
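For context on why serials aren't real types: SERIAL is just shorthand that Postgres expands into an INTEGER column backed by a sequence, roughly as in this sketch (raw_id_seq is the default generated name):

CREATE SEQUENCE raw_id_seq;
CREATE TABLE raw (
  id INTEGER NOT NULL DEFAULT nextval('raw_id_seq'),
  -- ... remaining columns as before ...
);
ALTER SEQUENCE raw_id_seq OWNED BY raw.id;

Note also that on PostgreSQL 10 and later (newer than this answer), the sequence behind a SERIAL is itself created with an INTEGER data type, so after the ALTER TABLE above you may also need: ALTER SEQUENCE raw_id_seq AS bigint; on the versions current when this was written, sequences were always 8-byte and no extra step was needed.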