Kafka error deserializing key/value for partition
Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/46113928/
Asked by Lewis Watson
I've got an integration test that passes when I send to a Kafka topic without a key. However, when I add a key I start to get serialization errors.
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition topic-1 at offset 0
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by IntegerDeserializer is not 4
This is my sender class:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;

public class Sender {

    private static final Logger LOG = LoggerFactory.getLogger(Sender.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void send() {
        String topic = "topic";
        String data = "data";
        String key = "key";
        LOG.info("sending to topic: '{}', key: '{}', data: '{}'", topic, key, data);
        // does not work
        kafkaTemplate.send(topic, key, data);
        // works
        // kafkaTemplate.send(topic, data);
    }
}
This is my configuration, where I specify a StringSerializer for the key:
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
@EnableKafka
public class Config {

    @Bean
    public Sender sender() {
        return new Sender();
    }

    // Properties here is a custom configuration holder that exposes getBootstrapServers(),
    // not java.util.Properties
    @Bean
    public Properties properties() {
        return new Properties();
    }

    @Bean
    public Map<String, Object> producerConfigs(Properties properties) {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, properties.getBootstrapServers());
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, String> producerFactory(Properties properties) {
        return new DefaultKafkaProducerFactory<>(producerConfigs(properties));
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(Properties properties) {
        return new KafkaTemplate<>(producerFactory(properties));
    }
}
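The Properties bean above is not java.util.Properties but a custom holder from the project; only its getBootstrapServers() accessor is implied by the code. A minimal hypothetical sketch of what such a class might look like (the field name, default value, and setter are assumptions, not taken from the original post):

// Hypothetical sketch of the custom Properties holder referenced by Config.
// The real project may populate bootstrapServers differently; only
// getBootstrapServers() is implied by the configuration above.
public class Properties {

    // assumed to be populated from external configuration, e.g. application.yml
    private String bootstrapServers = "localhost:9092";

    public String getBootstrapServers() {
        return bootstrapServers;
    }

    public void setBootstrapServers(String bootstrapServers) {
        this.bootstrapServers = bootstrapServers;
    }
}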
The problem may be related to the message listener in my test, but that also uses strings across the board:
@RunWith(SpringRunner.class)
@SpringBootTest()
@DirtiesContext
public class SenderIT {

    public static final Logger LOG = LoggerFactory.getLogger(SenderIT.class);

    private static String SENDER_TOPIC = "topic";

    @Autowired
    private Sender sender;

    private KafkaMessageListenerContainer<String, String> container;

    private BlockingQueue<ConsumerRecord<String, String>> records;

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, SENDER_TOPIC);

    @Before
    public void setUp() throws Exception {
        // set up the Kafka consumer properties
        Map<String, Object> consumerProperties =
                KafkaTestUtils.consumerProps("sender", "false", embeddedKafka);
        consumerProperties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        // create a Kafka consumer factory
        DefaultKafkaConsumerFactory<String, String> consumerFactory =
                new DefaultKafkaConsumerFactory<String, String>(consumerProperties);

        // set the topic that needs to be consumed
        ContainerProperties containerProperties = new ContainerProperties(SENDER_TOPIC);

        // create a Kafka MessageListenerContainer
        container = new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);

        // create a thread safe queue to store the received messages
        records = new LinkedBlockingQueue<>();

        // set up a Kafka message listener
        container.setupMessageListener(new MessageListener<String, String>() {
            @Override
            public void onMessage(ConsumerRecord<String, String> record) {
                LOG.debug("test-listener received message='{}'", record.toString());
                records.add(record);
            }
        });

        // start the container and underlying message listener
        container.start();

        // wait until the container has the required number of assigned partitions
        ContainerTestUtils.waitForAssignment(container, embeddedKafka.getPartitionsPerTopic());
    }

    @After
    public void tearDown() {
        // stop the container
        container.stop();
    }

    @Test
    public void test() throws InterruptedException {
        sender.send();

        // check that the message was received in Kafka
        ConsumerRecord<String, String> kafkaTopicMsg = records.poll(10, TimeUnit.SECONDS);
        LOG.debug("kafka received = {}", kafkaTopicMsg);
        assertThat(kafkaTopicMsg).isNotNull();
    }
}
As always, any help would be appreciated.
All the code to reproduce is available at https://github.com/LewisWatson/kafka-embedded-test/tree/8322621ad4e302d982e5ecd28af9fd314696d850
Full stack trace is available at https://travis-ci.org/LewisWatson/kafka-embedded-test/builds/273227986
Answered by Lewis Watson
After further inspection of the logs, I was able to narrow the problem down to the test message listener:
2017-09-08 09:30:06.845 ERROR 2550 --- [ -C-1] essageListenerContainer$ListenerConsumer : Container exception
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition topic-1 at offset 0
Caused by: org.apache.kafka.common.errors.SerializationException: Size of data received by IntegerDeserializer is not 4
https://travis-ci.org/LewisWatson/kafka-embedded-test/builds/273227986#L2961
It looks like it's expecting the key to be an integer for some reason.
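A quick way to see what that exception means (a sketch, not part of the original answer): IntegerDeserializer requires exactly 4 bytes, while StringSerializer encodes the key "key" as 3 UTF-8 bytes, so reading the produced key back as an Integer fails.

import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class DeserializerMismatchDemo {
    public static void main(String[] args) {
        // serialize the key the way the producer in the question does
        byte[] keyBytes = new StringSerializer().serialize("topic", "key"); // 3 bytes

        // attempting to read it back as an Integer (4 bytes expected) throws
        // org.apache.kafka.common.errors.SerializationException:
        // "Size of data received by IntegerDeserializer is not 4"
        new IntegerDeserializer().deserialize("topic", keyBytes);
    }
}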
Explicitly setting string deserializers for the consumer factory fixed the problem:
// create a Kafka consumer factory
DefaultKafkaConsumerFactory<String, String> consumerFactory =
        new DefaultKafkaConsumerFactory<String, String>(consumerProperties,
                new StringDeserializer(), new StringDeserializer());
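An equivalent alternative (a sketch, not part of the original answer) is to override the deserializer classes in the consumer properties map before building the factory, since the map produced by KafkaTestUtils.consumerProps() defaults the key deserializer to IntegerDeserializer, which is why only the keyed send failed:

// Alternative sketch: set the deserializers in the properties map instead of
// passing Deserializer instances to the factory constructor.
consumerProperties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerProperties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

DefaultKafkaConsumerFactory<String, String> consumerFactory =
        new DefaultKafkaConsumerFactory<>(consumerProperties);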