How to produce a JSON object message into a Kafka topic using Java (Spring)?

Disclaimer: this page is translated from a popular Stack Overflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original source: http://stackoverflow.com/questions/50198156/



java, json, maven, apache-kafka, kafka-producer-api

Asked by guguli

I want to produce a message into a Kafka topic. The message should have this pattern:

   {"targetFileInfo":{"path":"2018-05-07-10/row01-small-01.txt.ready"}}

I know that is a JSON pattern, so how can I convert that JSON into a String?

I use a Maven project, so which dependencies are needed to use something like:

 String stringData = JSON.stringify({"targetFileInfo":{"path":"2018-05-07-10/row01-small-01.txt.ready"}});
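
(A side note: JSON.stringify is JavaScript, not Java. In Java the same thing is usually done with a JSON library; below is a minimal sketch using Jackson's ObjectMapper, assuming the jackson-databind dependency shown in the first answer further down.)

import java.util.Collections;
import java.util.Map;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

// ...

ObjectMapper mapper = new ObjectMapper();
Map<String, Object> payload = Collections.singletonMap(
        "targetFileInfo",
        Collections.singletonMap("path", "2018-05-07-10/row01-small-01.txt.ready"));
try {
    // writeValueAsString(...) is Jackson's rough counterpart of JSON.stringify(...)
    String stringData = mapper.writeValueAsString(payload);
} catch (JsonProcessingException e) {
    e.printStackTrace();
}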

So I think it is better not to convert the JSON to a String, but to send that message into the Kafka topic as it is.

My code looks like this. It can send a String, but I don't know how to modify it to send the message above. Maybe you can help me.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

// ...

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

// Sends a plain String value to the "event" topic
Producer<String, String> producer = new KafkaProducer<>(props);
String msg = "welcome";
producer.send(new ProducerRecord<String, String>("event", msg));

producer.close();

Answered by Pratapi Hemant Patel

As per the comments, you need to send a JsonNode as the message on Kafka. Write a custom Serializer / Deserializer for it.

import java.io.IOException;
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonNodeSerDes implements Serializer<JsonNode>, Deserializer<JsonNode> {

    private ObjectMapper mapper = new ObjectMapper();

    @Override
    public byte[] serialize(String topic, JsonNode data) {

        try {
            return mapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            return new byte[0];
        }
    }

    @Override
    public JsonNode deserialize(String topic, byte[] data) {

        try {
            return mapper.readValue(data, JsonNode.class);
        } catch (IOException e) {
            return null;
        }
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public void close() {
    }
}

I wrote the serializer / deserializer in the same class. You can also separate them into two classes (one implementing Serializer, the other implementing Deserializer), as sketched below.
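
A sketch of what that split might look like (class names are illustrative, not from the original answer; each class would go in its own file):

import java.io.IOException;
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serializer;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonNodeSerializer implements Serializer<JsonNode> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public byte[] serialize(String topic, JsonNode data) {
        try {
            return mapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            return new byte[0];
        }
    }

    @Override public void configure(Map<String, ?> configs, boolean isKey) { }
    @Override public void close() { }
}

public class JsonNodeDeserializer implements Deserializer<JsonNode> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public JsonNode deserialize(String topic, byte[] data) {
        try {
            return mapper.readValue(data, JsonNode.class);
        } catch (IOException e) {
            return null;
        }
    }

    @Override public void configure(Map<String, ?> configs, boolean isKey) { }
    @Override public void close() { }
}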

While creating the KafkaProducer you need to provide the "value.serializer" config, and the "value.deserializer" config when creating the KafkaConsumer.
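
A rough sketch of that wiring (not part of the original answer; the topic name is taken from the question and error handling is minimal):

import java.io.IOException;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// ...

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
// point "value.serializer" at the custom class above
props.put("value.serializer", JsonNodeSerDes.class.getName());

Producer<String, JsonNode> producer = new KafkaProducer<>(props);
ObjectMapper mapper = new ObjectMapper();
try {
    JsonNode value = mapper.readTree(
            "{\"targetFileInfo\":{\"path\":\"2018-05-07-10/row01-small-01.txt.ready\"}}");
    // topic name "event" is taken from the code in the question
    producer.send(new ProducerRecord<String, JsonNode>("event", value));
} catch (IOException e) {
    e.printStackTrace();
} finally {
    producer.close();
}

// On the consumer side, set "value.deserializer" to the same class (or to a
// separate Deserializer implementation) when creating the KafkaConsumer.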

External Dependencies used:

使用的外部依赖:

<dependency>
  <groupId>com.fasterxml.Hymanson.core</groupId>
  <artifactId>Hymanson-databind</artifactId>
  <version>2.8.8</version>
</dependency>
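
The code above also needs the Kafka client library itself on the classpath; if it is not already there, a dependency along these lines is required as well (the version shown is only an example):

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <!-- example version; use the one that matches your broker / client setup -->
  <version>1.1.0</version>
</dependency>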

Answered by guguli

That solved my problem:


import java.util.Properties;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

// ...

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("acks", "all");
props.put("retries", 0);
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

Producer<String, String> producer = null;
try {
    producer = new KafkaProducer<String, String>(props);
} catch (Exception e) {
    e.printStackTrace();
}

// BlobStorageChecker is the asker's own helper class (not shown here)
BlobStorageChecker blobStorageChecker = new BlobStorageChecker();
String folder = blobStorageChecker.getCurrentDateUTC();

// The JSON is built as a plain String and sent with the ordinary StringSerializer
String msg = "{\"targetFileInfo\":{\"path\":\"test/"+folder+"row01-small.txt\"},\"sourceFileInfo\":{\"lastModifiedTime\":1525437960000,\"file\":\"/row01-small-01.txt\",\"filename\":\"/data/row01/row01-small.txt\",\"size\":19728,\"remoteUri\":\"ftp://waws-prod-am2-191.ftp.net/data/orsted-real/inbound/row01\",\"contentEncoding\":\"\",\"contentType\":\"\"}}";
ProducerRecord<String, String> record = new ProducerRecord<String, String>("event-orsted-v1", null, msg);

if (producer != null) {
    try {
        Future<RecordMetadata> future = producer.send(record);
        RecordMetadata metadata = future.get();
    } catch (Exception e) {
        System.err.println(e.getMessage());
        e.printStackTrace();
    }
    producer.close();
}