PostgreSQL -> Oracle replication

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, cite the original URL, and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/1007724/

Date: 2020-09-18 18:25:26  Source: igfitidea

PostgreSQL -> Oracle replication

Tags: database · oracle · postgresql · replication

Asked by

I'm looking for a tool to export data from a PostgreSQL DB to an Oracle data warehouse. I'm really looking for a heterogeneous DB replication tool, rather than an export->convert->import solution.


Continuent Tungsten Replicator looks like it would do the job, but PostgreSQL support won't be ready for another couple of months.


Are there any open-source tools out there that will do this? Or am I stuck with some kind of scheduled pg_dump/SQL*Loader solution?


Accepted answer by tuinstoel

You can create a database link from Oracle to Postgres (this is called heterogeneous connectivity). This makes it possible to select data from Postgres with a select statement in Oracle. You can use materialized views to schedule and store the results of those selects.
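A minimal sketch of that approach, assuming an Oracle Database Gateway for ODBC (DG4ODBC) is already configured with a TNS entry named `pglink` pointing at the Postgres data source; the link name, credentials, table name, and refresh interval below are all illustrative:

```sql
-- Create a database link from Oracle to the Postgres instance
-- (assumes a working DG4ODBC / heterogeneous services setup named 'pglink').
CREATE DATABASE LINK pg_dwh
  CONNECT TO "pg_user" IDENTIFIED BY "pg_password"
  USING 'pglink';

-- Query Postgres directly from Oracle. Note the double-quoted lowercase
-- identifiers: Postgres folds unquoted names to lowercase, Oracle to uppercase.
SELECT * FROM "orders"@pg_dwh;

-- Snapshot the remote table on a schedule (here: a full refresh every hour).
CREATE MATERIALIZED VIEW orders_mv
  REFRESH COMPLETE
  START WITH SYSDATE NEXT SYSDATE + 1/24
  AS SELECT * FROM "orders"@pg_dwh;
```

A `REFRESH COMPLETE` re-pulls the whole table each cycle; incremental (fast) refresh generally isn't available over a heterogeneous link, so this suits modest table sizes or infrequent schedules.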


Answer by chenson42

It sounds like SymmetricDS would work for your scenario. SymmetricDS is web-enabled, database-independent data synchronization/replication software. It uses web and database technologies to replicate tables between relational databases in near real time.
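For a flavor of how SymmetricDS is configured, here is a hypothetical routing setup expressed as inserts into its runtime configuration tables. The `sym_*` table and column names come from the SymmetricDS schema, but the group, trigger, and router ids (`pg-source`, `oracle-target`, `orders`, `pg-to-ora`) are made up for illustration; consult the SymmetricDS user guide for the authoritative column list for your version:

```sql
-- Define the two node groups and a push link between them.
insert into sym_node_group (node_group_id) values ('pg-source');
insert into sym_node_group (node_group_id) values ('oracle-target');
insert into sym_node_group_link (source_node_group_id, target_node_group_id, data_event_action)
  values ('pg-source', 'oracle-target', 'P');  -- 'P' = push changes to the target

-- Capture changes on the "orders" table and route them to the Oracle group.
insert into sym_trigger (trigger_id, source_table_name, channel_id, create_time, last_update_time)
  values ('orders', 'orders', 'default', current_timestamp, current_timestamp);
insert into sym_router (router_id, source_node_group_id, target_node_group_id, router_type, create_time, last_update_time)
  values ('pg-to-ora', 'pg-source', 'oracle-target', 'default', current_timestamp, current_timestamp);
insert into sym_trigger_router (trigger_id, router_id, initial_load_order, create_time, last_update_time)
  values ('orders', 'pg-to-ora', 1, current_timestamp, current_timestamp);
```

SymmetricDS installs triggers on the source tables to capture changes, which is what makes near-real-time replication possible without polling.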


Answer by mancini0

Consider using the Confluent Kafka Connect JDBC sink and source connectors if you'd like to replicate data changes across heterogeneous databases in real time. The source connector can select the entire database, particular tables, or rows returned by a provided query, and send the data as Kafka messages to your Kafka broker. The source connector can calculate the diffs based on an incrementing id column, a timestamp column, or be run in bulk mode, where the entire contents are recopied periodically. The sink can read these messages, optionally check them against an Avro or JSON schema, and populate the target database with the results. It's all free, and several sink and source connectors exist for many relational and non-relational databases.
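As a rough idea of what the source side looks like, here is a hypothetical configuration for the Confluent JDBC source connector (the payload you would POST to the Kafka Connect REST API). The connector class and property names are the connector's documented ones, but the host, database, credentials, table, and column names are invented for this sketch:

```json
{
  "name": "pg-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://pg-host:5432/salesdb",
    "connection.user": "replicator",
    "connection.password": "secret",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "pg-"
  }
}
```

The `timestamp+incrementing` mode catches both updates (via `updated_at`) and inserts (via `id`); `bulk` mode instead recopies everything on each poll. A matching JDBC sink connector pointed at the Oracle JDBC URL completes the pipeline.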


One major caveat: some JDBC Kafka connectors cannot capture hard deletes.


To get around that limitation, you can use a log-based change data capture connector such as Debezium (http://www.debezium.io); see also "Delete events from JDBC Kafka Connect Source".


Answer by ProdDBA

Sounds like you want an ETL (extract, transform, load) tool. There are a lot of open-source options; Enhydra Octopus and Talend Open Studio are a couple I've come across. In general, ETL tools offer you better flexibility than the straight-across replication option. Some offer scheduling, data quality, and data lineage features.
