Our producer application is designed like a real-time log producer: every three seconds it produces a new record containing a random IP address. You can add a few seed records to the IP_Log.log file, and the producer will take care of generating millions of unique records from those few records.
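The idea of turning a handful of seed records into an unbounded stream can be sketched as follows. This is a minimal, self-contained illustration (not the book's actual producer code): it assumes each seed line begins with an IP address followed by a space, and simply swaps in a freshly generated random IPv4 address to mint a "new" record. The class and method names here are hypothetical.

```java
import java.util.List;
import java.util.Random;

public class RandomIpDemo {
    // Replace the leading IP of a seed log line with a random IPv4
    // address, producing a record that looks new and unique.
    static String randomizeIp(String seedLine, Random rnd) {
        String randomIp = rnd.nextInt(256) + "." + rnd.nextInt(256) + "."
                + rnd.nextInt(256) + "." + rnd.nextInt(256);
        // Assumption: the seed record starts with "<ip> <rest-of-line>".
        int firstSpace = seedLine.indexOf(' ');
        return randomIp + seedLine.substring(firstSpace);
    }

    public static void main(String[] args) throws InterruptedException {
        // Hypothetical seed records, standing in for IP_Log.log contents.
        List<String> seeds = List.of(
                "10.0.0.1 - - [GET /index.html 200]",
                "10.0.0.2 - - [GET /login 302]");
        Random rnd = new Random();
        for (String seed : seeds) {
            System.out.println(randomizeIp(seed, rnd));
        }
        // The real producer would loop forever, sleeping three seconds
        // between records, e.g.: Thread.sleep(3000);
    }
}
```

In the actual application, each generated line would be wrapped in a `ProducerRecord` and sent to Kafka rather than printed.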
We have also enabled the auto-creation of topics, so you need not create the topic before running your producer application. You can change the topic name in the streaming.properties file, mentioned previously:
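For orientation, a streaming.properties file of this kind might look like the fragment below. The property keys shown here are illustrative assumptions, not the book's actual key names; check the PropertyReader class for the keys it really reads.

```properties
# Hypothetical keys -- adjust to match what PropertyReader expects.
bootstrap.servers=localhost:9092
# Topic the producer writes to; change this to rename the topic.
topic.name=ip-log-topic
```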
package com.packt.storm.producer;

import com.packt.storm.reader.PropertyReader;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
...