Updated February 28, 2023
Introduction to Kafka Console Consumer
Kafka offers a command-line utility to view messages from the terminal. The utility, kafka-console-consumer.sh, reads messages from a Kafka topic and writes them to the console (standard output). By default, it outputs the raw bytes of each message with no formatting (using the default message formatter).
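As a quick sketch of what this looks like in practice (assuming a broker listening on localhost:9092 and an existing topic named myTopic; adjust both for your cluster):

```shell
# Read every message in myTopic from the beginning and print it to stdout.
# Requires a running Kafka broker on localhost:9092.
bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic myTopic \
  --from-beginning
```

The command keeps running and prints new messages as they arrive; press Ctrl+C to stop it.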
How to Consume Data from Kafka using Kafka Console Consumer?
Following are the steps that need to be followed for consuming messages from the topic.
1. Starting Zookeeper
Kafka requires Zookeeper and cannot start without it. Start Zookeeper using the zkServer.sh script.
$ bin/zkServer.sh start
On your standard output, you will see as follows:
Zookeeper JMX enabled by default
Using config: /opt/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... STARTED
Use the following command to connect to your local Zookeeper server:
$ bin/zkCli.sh -server 127.0.0.1:2181
2. Starting Kafka Server
To start the Kafka broker service, use the following command.
$ bin/kafka-server-start.sh -daemon config/server.properties
Once the Kafka broker is running, we can verify that it works by performing a few simple operations against the cluster: creating a test topic, producing some messages, and consuming those same messages.
3. Create Topic
Before we can consume messages from a topic, we first need to create one. To do so, we will use the utility that Kafka provides for working with topics, kafka-topics.sh.
Let’s create a topic called “myTopic” with a single partition and a single replica:
$ bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic myTopic
The replication factor determines how many copies of the data will be maintained. As we're working with a single instance, keep this value at 1. The number of partitions controls how many brokers your data is divided between. As we're running with a single broker, keep this value at 1 as well.
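For contrast, on a hypothetical three-broker cluster you could spread a topic across more partitions and replicas. The broker count, topic name, and values below are illustrative, not part of this walkthrough:

```shell
# Hypothetical: requires a cluster with at least 3 brokers registered in Zookeeper.
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
  --replication-factor 3 --partitions 6 --topic myReplicatedTopic

# Inspect the partition/replica layout of the new topic.
bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic myReplicatedTopic
```

The describe output lists, for each partition, its leader broker, the full replica set, and the in-sync replicas.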
4. Sending Messages to Kafka
The producer publishes messages to one or more Kafka topics; it is responsible for putting data into Kafka. We will use the utility that Kafka provides for sending messages to a topic from the command line, kafka-console-producer.sh. Any message you type in the terminal window is sent directly to the specified topic. By default, Kafka treats each line as a separate message. Let's start the producer, then type a couple of console messages to deliver to the server.
$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic myTopic
> Welcome to KafkaConsole
> This is myTopic
You can either exit this command or keep the terminal running for further testing. In the next step, open the Kafka consumer in a new terminal.
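Beyond plain lines, the console producer can also read key/value pairs from its input. A sketch of this (the ":" separator and the example keys here are arbitrary choices, not from the walkthrough above):

```shell
# parse.key tells the producer to split each input line into a key and a value;
# key.separator sets the character used for the split.
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic myTopic \
  --property parse.key=true \
  --property key.separator=:

# Example input lines typed at the prompt:
#   user1:Welcome to KafkaConsole
#   user2:This is myTopic
```

Keyed messages are hashed by key to a partition, so all messages with the same key land on the same partition.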
5. Using Kafka Console Consumer
Consumers connect to topics and read messages from brokers, reading data as part of consumer groups. Kafka provides a utility, kafka-console-consumer.sh, to read messages from a topic by subscribing to it.
Let's run the consumer and consume all the messages that the previous producer sent.
$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myTopic --from-beginning
Welcome to KafkaConsole
This is myTopic
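Because consumers read as part of consumer groups, you can start several console consumers with the same group id to split a topic's partitions between them. A sketch (the group name is arbitrary):

```shell
# Run this same command in two terminals: both consumers join my-test-group,
# the topic's partitions are divided between them, and each message is
# delivered to only one member of the group.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic myTopic --group my-test-group
```

Without --group, each console consumer gets a generated group of its own and therefore sees every message independently.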
The above steps are shown in the console as follows:
1. Starting Zookeeper
Now let's start Zookeeper by running the following command.
$ bin/zkServer.sh start
Zookeeper JMX enabled by default
Using config: /opt/zookeeper/bin/../conf/zoo.cfg
Starting zookeeper ... STARTED
Run the following command to connect to the local Zookeeper server.
$ bin/zkCli.sh -server 127.0.0.1:2181
You should get a response containing CONNECTED, which means your local standalone Zookeeper installation is working correctly.
Connecting to 127.0.0.1:2181
...
...
[zk: 127.0.0.1:2181(CONNECTED) 0]
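You can also probe the server with Zookeeper's four-letter-word commands (assuming `nc` is installed and the four-letter words are enabled on the server; newer Zookeeper versions require whitelisting them via `4lw.commands.whitelist`):

```shell
# "ruok" asks the server whether it is running; a healthy server answers "imok".
echo ruok | nc 127.0.0.1 2181
```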
2. Starting Kafka Server
Now go to your directory of Apache Kafka and execute the following command.
$ bin/kafka-server-start.sh -daemon config/server.properties
[2020-02-10 10:47:45,989] INFO Kafka version: 1.0.1
(org.apache.kafka.common.utils.AppInfoParser)
[2020-02-10 10:47:45,995] INFO Kafka commitId: c0518aa65f25317e
(org.apache.kafka.common.utils.AppInfoParser)
[2020-02-10 10:47:46,006] INFO [KafkaServer id=0] started (kafka.server.KafkaServer)
Your Apache Kafka is up and running.
3. Creating Kafka Topic
Kafka includes a script, kafka-topics.sh, in the <KAFKA_HOME>/bin/ directory for creating topics on a Kafka cluster.
Let's create a topic named myTopic with a single partition and a replication factor of 1. Running the script produces a topic called myTopic, with its metadata kept in Zookeeper at localhost:2181.
$ bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic myTopic
Created topic "myTopic".
Now you can see the topic on Kafka by running the list command:
$ bin/kafka-topics.sh --list --zookeeper localhost:2181
myTopic
Alternatively, instead of creating topics manually, you can configure the brokers to auto-create topics when a message is published to a topic that does not yet exist.
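Auto-creation is controlled by broker settings in config/server.properties. A sketch of the relevant lines (the default partition and replica values shown are illustrative):

```
# Allow brokers to create a topic automatically when a client
# produces to or fetches from a topic that does not exist yet.
auto.create.topics.enable=true

# Defaults applied to auto-created topics.
num.partitions=1
default.replication.factor=1
```

Auto-creation is convenient for testing, but in production it is often disabled so that a typo in a topic name does not silently create a new topic.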
4. Sending Messages to Kafka Topic
Let's send messages to the Kafka topic by starting a producer using the kafka-console-producer.sh utility. Run the following command to launch a Kafka producer and use the console interface to write to the sample topic created above.
$ bin/kafka-console-producer.sh --broker-list localhost:9092 --topic myTopic
> Welcome to kafka
> This is my topic
5. Using Kafka Console Consumer
Kafka provides a utility, kafka-console-consumer.sh, to read messages from a topic by subscribing to it. Let's run the consumer and consume all the messages that the previous producer sent.
$ bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic myTopic --from-beginning
Welcome to kafka
This is my topic
In some cases, kafka-console-consumer is helpful as a standalone consumer of particular topics. For example, you can compare its output against the results of a consumer application you have built.
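When comparing output this way, the console consumer can print more than the raw value. A sketch using its formatter properties (assuming the messages were produced with keys; records without keys print "null" for the key):

```shell
# Print the timestamp, key, and value of each record instead of the value alone.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic myTopic --from-beginning \
  --property print.key=true \
  --property print.timestamp=true
```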
Conclusion
We have learned how to use the Kafka console consumer, how to create Kafka topics, and how to send messages using the Kafka console producer and read them on the consumer terminal. By operating the four components (Zookeeper, broker, producer, and consumer) in separate terminals, you can insert messages from the producer's terminal and see them displayed in the subscribing consumer terminal.
Recommended Articles
This is a guide to Kafka Console Consumer. Here we discussed the introduction and how consumers read or consume data from Kafka topics using the Kafka console consumer. You may also have a look at the following articles to learn more –