I have a source topic and a destination topic. Both topics have a replication factor of 3 and 3 partitions, and there are 3 broker machines. I am reading data from the source topic and applying a windowing function with a 30-second window length and a 3-second commit interval, using KStreams and KTable in Kafka Streams.

After running the jar file, I am getting duplicate records with the same timestamp. I am consuming the output with the Apache NiFi consumer processor.

Why is the Kafka output topic producing duplicate records? Is there any way we can prevent the duplicates?

Question from: https://stackoverflow.com/questions/66066434/kafka-streams-records-duplicating-windowing-function
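
For reference, a minimal sketch of the kind of topology described above, assuming a simple windowed count; the topic names, key/value types, and the aggregation itself are assumptions, not the asker's actual code:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.*;

    import java.time.Duration;
    import java.util.Properties;

    public class WindowedCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-count-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
            // Commit interval of 3 seconds, as described in the question.
            props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 3000);

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> source = builder.stream("source-topic",
                    Consumed.with(Serdes.String(), Serdes.String()));

            // 30-second tumbling window aggregation (KStream -> KTable).
            KTable<Windowed<String>, Long> counts = source
                    .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                    .windowedBy(TimeWindows.of(Duration.ofSeconds(30)))
                    .count();

            // With record caching enabled, the current window result is forwarded
            // on each commit / cache flush, so the destination topic can receive
            // several updates for the same window key.
            counts.toStream()
                  .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count))
                  .to("destination-topic", Produced.with(Serdes.String(), Serdes.Long()));

            new KafkaStreams(builder.build(), props).start();
        }
    }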


1 Answer

Waiting for answers
