I have a source topic and a destination topic, each with 3 partitions and a replication factor of 3, on a 3-broker cluster. I read data from the source topic and apply a windowing function with a 30-second window length and a commit interval of 3 seconds, using KStream and KTable in Kafka Streams.
After running the jar file, I am getting duplicate records with the same timestamp. I am consuming the output with an Apache NiFi consumer processor.
Why is the Kafka output topic producing duplicate records? Is there any way to restrict or suppress these duplicates?
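For reference, here is a minimal sketch of the topology described above, assuming a simple per-key count as the windowed aggregation; the topic names, application id, bootstrap servers, and serdes are placeholders, since they are not given in the question:

```java
import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.Windowed;

public class WindowedCountApp {

    // Build the topology: a 30-second tumbling-window count per key,
    // written back out as a plain KStream.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        KTable<Windowed<String>, Long> counts = builder
            .<String, String>stream("source-topic")                       // hypothetical topic name
            .groupByKey()
            .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofSeconds(30)))
            .count();
        counts.toStream()
              .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count.toString()))
              .to("destination-topic");                                   // hypothetical topic name
        return builder.build();
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-count-app");   // hypothetical id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // hypothetical address
        props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 3000);               // 3 s commit interval, as in the question
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        KafkaStreams streams = new KafkaStreams(buildTopology(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Note that, by default, Kafka Streams emits an updated result for a window on each commit (and cache flush), not a single final result when the window closes, so a downstream consumer such as NiFi can observe several records for the same window key.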
Question from: https://stackoverflow.com/questions/66066434/kafka-streams-records-duplicating-windowing-function