Deploy Kafka cluster (Data Lake stack)
First published: Wednesday, May 28, 2025 | Last updated: Wednesday, May 28, 2025
Deploy Kafka cluster (Data Lake stack) using the SloopStash Docker starter-kit.
Configure environment variables
Supported environment variables
# Allowed values for $ENVIRONMENT variable.
* dev
* qaa
* qab
Set environment variables
# Store environment variables.
$ export ENVIRONMENT=dev
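The compose commands later in this guide reference the env file as compose/${ENVIRONMENT^^}.env. The ${VAR^^} form is a Bash uppercase parameter expansion (Bash 4+), so the variable must be exported from a Bash shell; a quick sketch of how the path resolves:

```shell
# ${ENVIRONMENT^^} is Bash uppercase expansion (requires Bash 4+),
# so with ENVIRONMENT=dev the env-file path resolves to compose/DEV.env.
export ENVIRONMENT=dev
echo "compose/${ENVIRONMENT^^}.env"
# → compose/DEV.env
```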
Bootstrap Data Lake stack (Kafka cluster) environment
Docker
[!WARNING]
The Linux machine must have at least 2 GB RAM to avoid JVM memory pressure while running this 6-node Kafka cluster.
# Switch to Docker starter-kit directory.
$ cd /opt/kickstart-docker
# Provision OCI containers using Docker compose.
$ sudo docker compose -f compose/data-lake/kafka/main.yml --env-file compose/${ENVIRONMENT^^}.env -p sloopstash-${ENVIRONMENT}-data-lake-s2 up -d
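The exec commands below address containers by name. Docker Compose derives these names as &lt;project&gt;-&lt;service&gt;-&lt;replica&gt;; with the -p flag above, the project is sloopstash-dev-data-lake-s2 (for ENVIRONMENT=dev), which is why the containers carry names like sloopstash-dev-data-lake-s2-kafka-controller-1-1. A small sketch of that naming (service names taken from the exec commands in this guide):

```shell
# Compose container names follow <project>-<service>-<replica>.
ENVIRONMENT=dev
PROJECT="sloopstash-${ENVIRONMENT}-data-lake-s2"
echo "${PROJECT}-kafka-controller-1-1"   # controller node 1
echo "${PROJECT}-kafka-broker-1-1"       # broker node 1
```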
Kafka
Verify Kafka cluster
# Access Bash shell of existing OCI container running Kafka controller node 1.
$ sudo docker container exec -ti sloopstash-${ENVIRONMENT}-data-lake-s2-kafka-controller-1-1 /bin/bash
# Switch to Kafka source directory.
$ cd /usr/local/lib/kafka
# Access Kafka metadata shell.
$ ./bin/kafka-metadata-shell.sh --snapshot /opt/kafka/data/__cluster_metadata-0/00000000000000000000.log
# List Kafka broker nodes.
>> ls brokers
>> exit
# Exit shell.
$ exit
Create Kafka topic
# Access Bash shell of existing OCI container running Kafka broker node 1.
$ sudo docker container exec -ti sloopstash-${ENVIRONMENT}-data-lake-s2-kafka-broker-1-1 /bin/bash
# Switch to Kafka source directory.
$ cd /usr/local/lib/kafka
# Create Kafka topic.
$ ./bin/kafka-topics.sh --create --topic sloopengine-product-update --if-not-exists --partitions 3 --replication-factor 2 --bootstrap-server 0.0.0.0:9092
# Exit shell.
$ exit
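As a quick sanity check on the flags above: 3 partitions with a replication factor of 2 means the cluster stores 6 partition replicas in total, spread across the broker nodes.

```shell
# Total partition replicas = partitions × replication factor.
PARTITIONS=3
REPLICATION_FACTOR=2
echo $((PARTITIONS * REPLICATION_FACTOR))
# → 6
```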
Write or publish messages to Kafka topic
# Access Bash shell of existing OCI container running Kafka broker node 2.
$ sudo docker container exec -ti sloopstash-${ENVIRONMENT}-data-lake-s2-kafka-broker-2-1 /bin/bash
# Switch to Kafka source directory.
$ cd /usr/local/lib/kafka
# Write or publish messages to Kafka topic.
$ ./bin/kafka-console-producer.sh --topic sloopengine-product-update --bootstrap-server 0.0.0.0:9092
> SloopEngine IDE v2.1.4 has been released.
> SloopEngine IDE protects your source code from developers.
# Exit shell.
$ exit
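Each line entered at the > prompt becomes one Kafka record, so the two messages above produce two records. The console producer also reads standard input, so the same messages could be piped in non-interactively; a sketch, with wc -l standing in to count the records that would be produced:

```shell
# Each input line is one record; piping two lines would produce two records.
printf '%s\n' \
  'SloopEngine IDE v2.1.4 has been released.' \
  'SloopEngine IDE protects your source code from developers.' |
  wc -l
# → 2
```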
Read or stream messages from Kafka topic
# Access Bash shell of existing OCI container running Kafka broker node 3.
$ sudo docker container exec -ti sloopstash-${ENVIRONMENT}-data-lake-s2-kafka-broker-3-1 /bin/bash
# Switch to Kafka source directory.
$ cd /usr/local/lib/kafka
# Read or stream messages from Kafka topic.
$ ./bin/kafka-console-consumer.sh --topic sloopengine-product-update --from-beginning --bootstrap-server 0.0.0.0:9092
# Exit shell.
$ exit
Manage Data Lake stack (Kafka cluster) environments
Docker
# Switch to Docker starter-kit directory.
$ cd /opt/kickstart-docker
# Stop and remove OCI containers using Docker compose.
$ sudo docker compose -f compose/data-lake/kafka/main.yml --env-file compose/${ENVIRONMENT^^}.env -p sloopstash-${ENVIRONMENT}-data-lake-s2 down
# Restart OCI containers using Docker compose.
$ sudo docker compose -f compose/data-lake/kafka/main.yml --env-file compose/${ENVIRONMENT^^}.env -p sloopstash-${ENVIRONMENT}-data-lake-s2 restart