Kafkacat, the CLI for Kafka

Mar 27th, 2020 - written by Kimserey.

Kafka is a widely used message broker platform. Although very powerful, developing and testing applications that consume or produce Kafka messages can be really painful. In today’s post we will look at kafkacat, a command line tool that makes our life easier when interacting with Kafka.

Install kafkacat

Kafkacat can be installed using apt-get on Linux or brew on macOS.

brew install kafkacat
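
On Debian or Ubuntu, an apt-get install should work as well; a small sketch, assuming the package is published under the name kafkacat (which is the case on recent releases):

sudo apt-get install kafkacat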

Once installed we then have access to the CLI tool.

kafkacat -h

List Topics

For the rest of this example, I’ll be connecting to a Kafka broker that I am running locally at localhost:9092.

Listing the topics currently available on the broker can be achieved with:

$> kafkacat -L -b localhost:9092

Metadata for all topics (from broker 1001: localhost:9092/1001):
 1 brokers:
  broker 1001 at localhost:9092 (controller)
 2 topics:
  topic "message-topic" with 1 partitions:
    partition 0, leader 1001, replicas: 1001, isrs: 1001
  topic "json-topic" with 1 partitions:
    partition 0, leader 1001, replicas: 1001, isrs: 1001

-L is for listing the metadata and -b specifies the broker address.
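
If we are only interested in a single topic, the listing can be restricted by adding -t; a small sketch, reusing the message-topic topic from the output above:

kafkacat -L -b localhost:9092 -t message-topic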

Start a Producer

A producer in Kafka is a program producing messages into a topic.

We can produce a message with kafkacat with:

kafkacat -P -b localhost:9092 -t mytopic

We can then start typing messages directly in the terminal, one message per line.
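
If our messages need a key, kafkacat also accepts a key delimiter with -K; a minimal sketch, assuming we want to type the key and the value separated by a colon:

kafkacat -P -b localhost:9092 -t mytopic -K:

Typing user1:hello in the interactive session would then produce a message with key user1 and payload hello.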

Or we can specify the message from a file:

kafkacat -P -b localhost:9092 -t mytopic mymessage.json

-P is used to create a producer and -t specifies the topic on which to produce the message. When we don’t provide any further arguments, we enter an interactive session that allows us to write messages in the shell. Otherwise we can provide a file; here for example I have provided a JSON file called mymessage.json, which is sent as a single message.
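
If instead we wanted every line of the file to become its own message, the -l flag can be added; a small sketch with the same hypothetical file:

kafkacat -P -b localhost:9092 -t mytopic -l mymessage.json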

Start a Consumer

Lastly we can consume a particular topic with:

kafkacat -C -b localhost:9092 -t mytopic

This will read the messages we produced earlier and wait for future messages.
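
When we only want to peek at a few messages without reading the whole topic, the consumer can also be capped with -c; a minimal sketch reading at most 5 messages and exiting:

kafkacat -C -b localhost:9092 -t mytopic -c 5 -e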

If we wanted to stop consuming once we reach the end of the topic, we could use -e. We can also start from a given offset with -o; for example -o 10 reads from offset 10, so all messages from position 10 onward, included.

kafkacat -C -b localhost:9092 -t mytopic -e -o 10

Or we can consume from the end by passing a negative offset:

kafkacat -C -b localhost:9092 -t mytopic -e -o -10

-o -10 selects the last 10 messages of the topic.

Using -J we are able to consume the messages and print them with a JSON envelope:

$> kafkacat -C -b localhost:9092 -t message-topic -e -J         

{"topic":"message-topic","partition":0,"offset":0,"tstype":"create","ts":1580573567909,"key":null,"payload":"test"}
{"topic":"message-topic","partition":0,"offset":1,"tstype":"create","ts":1580573569101,"key":null,"payload":"test"}
{"topic":"message-topic","partition":0,"offset":2,"tstype":"create","ts":1580573570489,"key":null,"payload":"test"}
% Reached end of topic message-topic [0] at offset 3: exiting
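
Since the envelope is plain JSON, the output pipes nicely into other tools; a small sketch, assuming jq is installed, keeping only the payloads:

kafkacat -C -b localhost:9092 -t message-topic -e -J | jq -r .payload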

We can see that the timestamps are in milliseconds. Using that, we are able to query a range as well with -o s@<ms> and -o e@<ms>, s and e being start and end.

kafkacat -C -b localhost:9092 -t message-topic -e -o s@1580573567909 -o e@1580574580530

This would query messages from Sat Feb 01 2020 16:12:47 to Sat Feb 01 2020 16:29:40.
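
The millisecond timestamps can also be computed on the fly; a minimal sketch, assuming GNU date (Linux), consuming everything produced in the last hour:

kafkacat -C -b localhost:9092 -t message-topic -e -o s@$(date -d '1 hour ago' +%s)000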

Lastly we are also able to target a specific partition using -p <partition>, as shown below.
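
A minimal sketch, reading only partition 0 of our hypothetical mytopic from the beginning:

kafkacat -C -b localhost:9092 -t mytopic -p 0 -e

And that concludes today’s post!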

Conclusion

Today we looked into how we can use kafkacat to quickly interact with a Kafka broker, creating a consumer and a producer. With just those simple tools in our toolbox, we can quickly test applications in development that act either as consumers or as producers. I hope you liked this post and see you in the next one!
