Kafka Cheat Sheet Shortlist

May 30, 2022 | DevOps

When using Kafka, whether it is deployed via a Kubernetes Operator (such as Strimzi) or as a plain installation, there is a set of common commands needed for the basic day-to-day maintenance and operation of the Kafka cluster.

As a first step, download the Kafka distribution package. This package contains a set of Bash scripts that run the core (Java-based) Kafka components and enable interaction with the Kafka cluster (e.g., creating topics, changing configs, etc.).

When running these Kafka commands, note that they should be executed against either ZooKeeper or a bootstrap server, depending on the Kafka version you are using: older versions take a --zookeeper flag, while newer versions use --bootstrap-server.

Here is a list of the commands that, in my opinion, are the most used and most necessary for the day-to-day operation of a Kafka cluster.


List All Topics
bin/kafka-topics.sh --list --zookeeper SERVER_ADDRESS:2181
bin/kafka-topics.sh --list --bootstrap-server SERVER_ADDRESS:9092


Show Topic Details
./bin/kafka-topics.sh --describe --bootstrap-server SERVER_ADDRESS:9092 --topic TOPIC_NAME


Create a Topic
./bin/kafka-topics.sh --create --bootstrap-server SERVER_ADDRESS:9092 --topic TOPIC_NAME --partitions 2 --replication-factor 2


Change Topic Configuration Values

This command can be used to change or add configuration values on topics. This specific example shows how to change the retention.ms value (the retention period in milliseconds).

./bin/kafka-configs.sh --alter --bootstrap-server SERVER_ADDRESS:9092 --add-config retention.ms=1800000 --entity-type topics --entity-name TOPIC_NAME
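To confirm that the change took effect, the topic's configuration overrides can be listed with the same script's --describe mode (SERVER_ADDRESS and TOPIC_NAME are the same placeholders as above):

```shell
# Show the non-default config overrides currently set on the topic
./bin/kafka-configs.sh --describe \
  --bootstrap-server SERVER_ADDRESS:9092 \
  --entity-type topics \
  --entity-name TOPIC_NAME
```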


Drain (purge) a Topic
./bin/kafka-configs.sh --alter --bootstrap-server SERVER_ADDRESS:9092 --add-config retention.ms=0 --entity-type topics --entity-name TOPIC_NAME
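Note that once the topic has been drained, the retention override should be removed (or set back to the desired value); otherwise new messages will keep being deleted almost immediately. A minimal sketch, assuming you simply want the broker default back, is to delete the override:

```shell
# Remove the retention.ms override so the topic falls back to the broker default
./bin/kafka-configs.sh --alter \
  --bootstrap-server SERVER_ADDRESS:9092 \
  --delete-config retention.ms \
  --entity-type topics \
  --entity-name TOPIC_NAME
```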


Run a Kafka Producer

The producer process lets us add data to the topics in our Kafka cluster. Running it opens a console for the producer, in which we can type data to send to the cluster. Each line of text is added as a separate message to the given topic.

bin/kafka-console-producer.sh --bootstrap-server SERVER_ADDRESS:9092 --topic TOPIC_NAME
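As an optional variation, the console producer can also send keyed messages by enabling key parsing and choosing a separator character (parse.key and key.separator are standard console-producer properties; the ':' separator below is just an illustrative choice):

```shell
# Produce keyed messages: each input line is split on ':' into key and value,
# e.g. typing "user1:hello" sends key=user1, value=hello
bin/kafka-console-producer.sh \
  --bootstrap-server SERVER_ADDRESS:9092 \
  --topic TOPIC_NAME \
  --property parse.key=true \
  --property key.separator=:
```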


Run a Kafka Consumer

The consumer process lets us read data from the topics in our Kafka cluster. The --from-beginning flag tells the consumer to start reading from the beginning of the topic, so we see all messages still retained in it (including old messages produced before the consumer started).

When running this command, every message read is printed to the console. If executed alongside a producer, any message typed into the producer console will appear in the consumer's output.

bin/kafka-console-consumer.sh --bootstrap-server SERVER_ADDRESS:9092 --topic TOPIC_NAME --from-beginning
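A related variation is running the consumer as part of a named consumer group, so offsets are committed and several consumer instances can share the topic's partitions between them (GROUP_NAME below is a placeholder, like SERVER_ADDRESS and TOPIC_NAME above):

```shell
# Consume as part of a consumer group; offsets are committed under this group id,
# so restarting the consumer resumes from where the group left off
bin/kafka-console-consumer.sh \
  --bootstrap-server SERVER_ADDRESS:9092 \
  --topic TOPIC_NAME \
  --group GROUP_NAME
```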
