java - How to convert from GenericRecord to SpecificRecord in Avro for compatible schemas

Disclaimer: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute the original authors (not me). Original question: http://stackoverflow.com/questions/33945383/

How to convert from GenericRecord to SpecificRecord in Avro for compatible schemas

Tags: java, avro

Asked by Mark D

Is the Avro SpecificRecord (i.e. the generated Java classes) compatible with schema evolution? That is, if I have a source of Avro messages (in my case, Kafka) and I want to deserialize those messages to a SpecificRecord, is it possible to do so safely?

What I see:

  • adding a field to the end of a schema works fine - it still deserializes OK to the SpecificRecord
  • adding a field in the middle does not - i.e. it breaks existing clients (see the schema sketch after this list)
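
For illustration, a minimal sketch of the two evolutions (the User record and its fields are made up). When the consumer decodes with only its own, older schema, an extra field appended at the end is simply left unread, but a field inserted in the middle shifts everything after it and the bytes are misinterpreted. With proper writer/reader schema resolution Avro matches fields by name, so both versions below are compatible with v1, given the default value.

// Requires org.apache.avro.Schema; the field layouts are hypothetical.
Schema v1 = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
  + "{\"name\":\"name\",\"type\":\"string\"},"
  + "{\"name\":\"id\",\"type\":\"long\"}]}");

// Field appended at the end: a reader that still decodes with v1 reads name and id
// from the same positions and just never touches the trailing email bytes.
Schema v2FieldAtEnd = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
  + "{\"name\":\"name\",\"type\":\"string\"},"
  + "{\"name\":\"id\",\"type\":\"long\"},"
  + "{\"name\":\"email\",\"type\":\"string\",\"default\":\"\"}]}");

// Field inserted in the middle: the email bytes now sit where a v1-only reader expects id,
// so decoding breaks unless the reader also knows this writer schema.
Schema v2FieldInMiddle = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
  + "{\"name\":\"name\",\"type\":\"string\"},"
  + "{\"name\":\"email\",\"type\":\"string\",\"default\":\"\"},"
  + "{\"name\":\"id\",\"type\":\"long\"}]}");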

Even if the messages are compatible, this is a problem.

If I can find the new schema (using e.g. the Confluent Schema Registry) I can deserialize to a GenericRecord, but there doesn't seem to be a way to map from a GenericRecord to a SpecificRecord of a different schema:

MySpecificType message = (MySpecificType) SpecificData.get().deepCopy(MySpecificType.SCHEMA$, genericMessage);

deepCopy is mentioned in various places, but it copies fields by index, so it doesn't work across the two schemas.

Is there any safe way to map between two Avro objects when you have both schemas and they are compatible? Even mapping from GenericRecord to GenericRecord would do, as I could then use the deepCopy trick to complete the job.

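For reference, one way to do the mapping the question asks about with Avro's own APIs is to re-encode the GenericRecord with its writer schema and then decode it with a SpecificDatumReader that is given both the writer and the reader schema, so resolution happens by field name rather than position. A minimal sketch, assuming a generated class MySpecificType (the method name is made up and error handling is omitted):

// Uses org.apache.avro Schema, GenericDatumWriter, GenericRecord, BinaryDecoder, BinaryEncoder,
// DecoderFactory, EncoderFactory, SpecificDatumReader, plus java.io.ByteArrayOutputStream.
public static MySpecificType toSpecific(GenericRecord genericMessage) throws IOException {
    Schema writerSchema = genericMessage.getSchema();  // schema the message was written with
    Schema readerSchema = MySpecificType.SCHEMA$;      // schema of the generated class

    // Re-encode the generic record using its own (writer) schema...
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
    new GenericDatumWriter<GenericRecord>(writerSchema).write(genericMessage, encoder);
    encoder.flush();

    // ...then decode it with a reader that knows both schemas, so fields are matched by name.
    BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
    SpecificDatumReader<MySpecificType> reader =
            new SpecificDatumReader<MySpecificType>(writerSchema, readerSchema);
    return reader.read(null, decoder);
}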

Accepted answer by JARC

There are example tests here for the specific data type conversion. It's all in the 'specificDeserializerProps' configuration:

https://github.com/confluentinc/schema-registry/blob/master/avro-serializer/src/test/java/io/confluent/kafka/serializers/KafkaAvroSerializerTest.java

I added the following config and got the specific type out as wanted.

// From the linked test: the registry client (schemaRegistry) is injected directly into the
// deserializer, so the URL can be a placeholder; the key setting is
// SPECIFIC_AVRO_READER_CONFIG=true, which makes the deserializer return SpecificRecord instances.
HashMap<String, String> specificDeserializerProps = new HashMap<String, String>();
specificDeserializerProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "bogus");
specificDeserializerProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
specificAvroDeserializer = new KafkaAvroDeserializer(schemaRegistry, specificDeserializerProps);
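
Outside of that test harness, a minimal sketch of the same configuration against a real registry (the URL, topic name, messageBytes and the generated class MySpecificType are all placeholders): the deserializer looks up the writer schema by the id embedded in each message and, because SPECIFIC_AVRO_READER_CONFIG is true, resolves it against the generated class's schema, so compatible evolutions work regardless of where the new field was added.

// Uses io.confluent.kafka.serializers.KafkaAvroDeserializer / KafkaAvroDeserializerConfig.
Map<String, Object> props = new HashMap<String, Object>();
props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");

KafkaAvroDeserializer deserializer = new KafkaAvroDeserializer();
deserializer.configure(props, false);  // false = configuring the value (not key) deserializer

MySpecificType value = (MySpecificType) deserializer.deserialize("my-topic", messageBytes);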

Hope that helps
