Setting up Apache Kafka for local development with Docker

Many companies struggle with huge volumes of incoming data and need to streamline their ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes.

When you work with a lot of data, several challenges appear: moving large volumes of data in-house is costly, the data can go stale quickly, and development slows down when you are connected directly to an Apache Kafka cluster in the cloud.

One way to deal with these problems is to use Docker Desktop and run Apache Kafka locally inside a container. A fully isolated environment on your own machine removes a lot of troubleshooting, speeds up development, and saves you money while you develop your producers and consumers.

First, we will create a single-node Apache Kafka instance running on Docker using Docker Compose.

If you do not have Docker Desktop installed, download it from the Docker website and install it. Then create a file named docker-compose.yml with the following content:

version: "2"

services:
  kafka:
    image: "bitnami/kafka:latest"
    ports:
      - "9092:9092"
      - "29092:29092"
    volumes:
      - "kafka_data:/bitnami"
      - "kafka_log:/tmp/kafka_mounts/logs"
    environment:
      - KAFKA_CFG_NODE_ID=0
      - KAFKA_CFG_PROCESS_ROLES=controller,broker
      - KAFKA_CFG_LISTENERS=PLAINTEXT://:9092,PLAINTEXT_HOST://:29092,CONTROLLER://:9093
      - KAFKA_CFG_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CONTROLLER:PLAINTEXT,PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      - KAFKA_CFG_CONTROLLER_QUORUM_VOTERS=0@kafka:9093
      - KAFKA_CFG_CONTROLLER_LISTENER_NAMES=CONTROLLER
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
      - ALLOW_PLAINTEXT_LISTENER=yes

volumes:
  kafka_log:
    driver: local
  kafka_data:
    driver: local
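With the file saved, you can start the broker from the same directory. A minimal sketch of the commands, assuming a recent Docker Desktop with the docker compose plugin; the service name kafka and the topic name test-events below just match the file above (the topic name is only an example):

```shell
# Start Kafka in the background
docker compose up -d

# Confirm the kafka service is up
docker compose ps

# Optional: create a test topic with the CLI tools bundled in the Bitnami image
docker compose exec kafka kafka-topics.sh \
  --bootstrap-server localhost:9092 \
  --create --topic test-events --partitions 1 --replication-factor 1
```

Note the two listeners in the Compose file: processes inside the Docker network reach the broker at kafka:9092 (PLAINTEXT), while clients running on your host machine should connect to localhost:29092 (PLAINTEXT_HOST).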

When it’s done, open Docker Desktop, go to Containers, and search for kafka; you should see the container up and running.

Now, it’s time to download a GUI tool so you can quickly inspect your local Apache Kafka instance and browse topics and events. I personally use Kadeck; it’s free for up to one connection, which works smoothly for development.

Go to the Kadeck website and download and install the software.

Now, you can develop your own producers and consumers with your local version of Apache Kafka running inside Docker.
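As a starting point, here is a minimal producer/consumer sketch in Python. It assumes the broker from the Compose file above is running, and that the kafka-python package is installed (pip install kafka-python); the topic name demo-events is just an example and will be auto-created by the broker:

```python
from kafka import KafkaProducer, KafkaConsumer

# Connect via the PLAINTEXT_HOST listener exposed to the host machine.
BOOTSTRAP = "localhost:29092"
TOPIC = "demo-events"  # example topic name; auto-created on first use

# Produce a single message.
producer = KafkaProducer(bootstrap_servers=BOOTSTRAP)
producer.send(TOPIC, b"hello from my local Kafka")
producer.flush()  # block until the message is actually written

# Consume it back, starting from the beginning of the topic.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BOOTSTRAP,
    auto_offset_reset="earliest",  # read from the start of the topic
    consumer_timeout_ms=5000,      # give up after 5 s of silence
)
for message in consumer:
    print(message.value.decode("utf-8"))
    break
```

You should see the message you produced printed back, and the demo-events topic appear in Kadeck. This snippet requires the live broker, so it will fail with a connection error if the container is not running.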

