PostgreSQL: concurrently change column type from int to bigint
Note: this page is a Chinese-English side-by-side translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must follow the same CC BY-SA terms, link to the original, and attribute it to the original authors (not me): StackOverflow
Original: http://stackoverflow.com/questions/33504982/
Warning: these are provided under the CC BY-SA 4.0 license. You are free to use/share them, but you must attribute them to the original authors (not me): StackOverflow
Asked by Pierre Michard
I have a pretty big table (around 1 billion rows), and I need to update the id type from SERIAL to BIGSERIAL; guess why? :)
Basically this could be done with this command:
execute "ALTER TABLE my_table ALTER COLUMN id SET DATA TYPE bigint"
Nevertheless that would lock my table forever and put my web service down.
Is there a quite simple way of doing this operation concurrently (whatever the time it will take)?
Accepted answer by Radek Postołowicz
If you don't have foreign keys pointing to your id, you could add a new column, fill it, drop the old one, and rename the new one to the old:
-- 1. Add the wider column; adding a nullable column is a quick metadata change.
alter table my_table add column new_id bigint;

-- 2. Backfill in batches, each in its own short transaction,
--    so row locks are held only briefly.
begin; update my_table set new_id = id where id between 0 and 100000; commit;
begin; update my_table set new_id = id where id between 100001 and 200000; commit;
begin; update my_table set new_id = id where id between 200001 and 300000; commit;
begin; update my_table set new_id = id where id between 300001 and 400000; commit;
...

-- 3. Build the unique index that will back the new primary key.
create unique index my_table_pk_idx on my_table(new_id);

-- 4. Swap the columns in one short transaction.
begin;
alter table my_table drop constraint my_table_pk;
alter table my_table alter column new_id set default nextval('my_table_id_seq'::regclass);
update my_table set new_id = id where new_id is null; -- catch rows inserted during the backfill
alter table my_table add constraint my_table_pk primary key using index my_table_pk_idx;
alter table my_table drop column id;
alter table my_table rename column new_id to id;
commit;
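The "..." in the answer stands for many more batches; for a billion rows you would script them rather than type them. A minimal sketch of generating those statements (the helper name `batched_backfill` is made up; the table and column names and 100k batch size are taken from the answer's example):

```python
def batched_backfill(table, new_col, old_col, max_id, batch_size=100_000):
    """Yield one begin/update/commit statement per id range, mirroring the
    short-transaction batches in the answer above."""
    lo, hi = 0, batch_size
    while lo <= max_id:
        yield (
            f"begin; update {table} set {new_col} = {old_col} "
            f"where {old_col} between {lo} and {min(hi, max_id)}; commit;"
        )
        lo, hi = hi + 1, hi + batch_size

# Print the statements; in practice you would feed them to psql or a
# database driver, one at a time, so each batch commits independently.
for stmt in batched_backfill("my_table", "new_id", "id", 300_000):
    print(stmt)
```

Running each statement as its own transaction is the point of the pattern: a single giant UPDATE would hold row locks (and bloat the table) for the whole run, while small batches let concurrent traffic interleave between commits.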