How to run a multi-node Kafka cluster locally?

  1. Download the Kafka Confluent Community Edition
  2. Unzip the downloaded file
  3. To run 3 brokers, go to the confluent-6.0.0/etc/kafka directory
  4. Copy the server.properties file 3 times and rename the copies (for example, server-1.properties, server-2.properties, and server-3.properties)
  5. The next step is to change the configuration inside these property files. 3 properties need to be changed:
    1. Broker id: Every broker should have a unique id. Assign one in each property file, for example:
      • broker.id=1 # For broker 1
      • broker.id=2 # For broker 2
      • broker.id=3 # For broker 3
    2. Listener port: This is the port at which the producer will send data to the Kafka broker and on which the consumer will listen for incoming messages. On a single machine it must therefore be unique for each broker. Uncomment the configuration below and assign a port number, for example:
      • listeners=PLAINTEXT://:9092 # For broker 1
      • listeners=PLAINTEXT://:9093 # For broker 2
      • listeners=PLAINTEXT://:9094 # For broker 3
    3. Log file directory: This is the location where each broker stores its data.
      • Assign a unique directory for each broker by modifying the config below:
        • log.dirs=/tmp/kafka-logs-1 # For broker 1
        • log.dirs=/tmp/kafka-logs-2 # For broker 2
        • log.dirs=/tmp/kafka-logs-3 # For broker 3
  6. Now run ZooKeeper with the command below. Change the path if you are doing this on Windows.
    • zookeeper-server-start ../etc/kafka/

  7. Open 3 terminals, one for each broker, and start each broker with the following command.
    • For server 1: kafka-server-start ../etc/kafka/
    • For server 2: kafka-server-start ../etc/kafka/
    • For server 3: kafka-server-start ../etc/kafka/
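Taken together, the three property files differ only in the three settings above. Assuming the first copy was named server-1.properties (the post does not show the actual file names), it would contain, in addition to the stock defaults:

```properties
# server-1.properties (file name assumed for illustration)
broker.id=1
listeners=PLAINTEXT://:9092
log.dirs=/tmp/kafka-logs-1
```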

With this, you will have a 3-node Kafka cluster running locally.
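The copy-and-edit steps above can also be scripted. This is a minimal sketch, assuming a POSIX shell with GNU sed, run from the confluent-6.0.0/etc/kafka directory; the file names server-1.properties through server-3.properties are illustrative, not required by Kafka.

```shell
#!/bin/sh
# Sketch: generate three per-broker property files from the stock
# server.properties. Assumptions (not from the post): POSIX shell, GNU sed,
# run from confluent-6.0.0/etc/kafka, illustrative file names server-N.properties.
for i in 1 2 3; do
  f="server-$i.properties"
  # Start from the stock file when it exists; otherwise start empty so the
  # sketch stays self-contained.
  if [ -f server.properties ]; then cp server.properties "$f"; else : > "$f"; fi
  # Drop any existing values (commented or not) for the three per-broker keys.
  sed -i -e '/^#\{0,1\}broker\.id=/d' \
         -e '/^#\{0,1\}listeners=/d' \
         -e '/^#\{0,1\}log\.dirs=/d' "$f"
  # Append unique values: broker.id 1-3, ports 9092-9094, one log dir each.
  {
    echo "broker.id=$i"
    echo "listeners=PLAINTEXT://:$((9091 + i))"
    echo "log.dirs=/tmp/kafka-logs-$i"
  } >> "$f"
done
# After generating the files, start ZooKeeper and then each broker in its own
# terminal (file names assumed, matching the sketch above):
#   zookeeper-server-start ../etc/kafka/zookeeper.properties
#   kafka-server-start ../etc/kafka/server-1.properties
#   kafka-server-start ../etc/kafka/server-2.properties
#   kafka-server-start ../etc/kafka/server-3.properties
```

Each generated file keeps everything from the stock server.properties except the three per-broker settings, which are replaced with unique values.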

Thanks!
