Kafka Consumer Lag Command Line

I will also explain a few things along the way, and this demo will provide a good sense of some of the command line tools that Kafka provides. The target audience is people who want to learn about Apache Kafka, ZooKeeper, queues, topics, client-server communication, messaging systems (point-to-point and pub-sub), single-node servers, multi-node servers or Kafka clusters, the command line producer and consumer, and producer and consumer applications using the Java APIs. Given a consumer with the group.id "mygroup", any other Kafka consumer actor with the same group.id belongs to the same consumer group. Can you please try with "--broker-list localhost:6667"? The broker seems to be running on port 6667. Usually I do this with the Kafka command line tools, but I always forget the exact command to run and have to look it up in different sources. Limiting the size of these files allows you to quickly diagnose problems if they occur. Topic deletion is enabled by default in new Kafka versions (from 1.0). A lag check such as kafka-consumer-lag --brokers kafka01.home:6667 --topic topic_name --group-id consumer_group_id will print the consumer group, topic, and offset information. Run the producer and then type a few messages into the console to send to the server. However, doing the (complex) work of setting the arguments for the console consumer would lengthen the line of code, not expand it to multiple lines. However, simply sending lines of text will result in messages with null keys. Example topic creation: kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic topic-name. The purpose of writing this post is to illustrate…. In a healthy Kafka cluster, all producers are pushing messages into topics and all consumers are pulling those messages at the other end of the topics.
I'm using Kafka version 0. STORM-1136: command line module to return Kafka spout offset lag and display it in the Storm UI. Kafka Console Producer and Consumer Example - in this Kafka tutorial, we shall learn to create a Kafka producer and a Kafka consumer using the console interface of Kafka. If you need to set Kafka consumer configuration that isn't supported by command line arguments, you can provide a standard Kafka consumer properties file: > prometheus-kafka-consumer-group-exporter --consumer-config consumer.properties. Zalando has trialled Burrow, which has performance issues, and Kafka lag monitor, which relates more to Storm. To verify the port number on which the Kafka broker is running, get into the ZooKeeper client shell. And we'll press Enter and we get the full documentation. The Kafka installation directory (e.g. /opt/kafka) and ZK_HOSTS, which identifies the running ZooKeeper ensemble, must both be set. Check out our Kafka Quickstart Tutorial to get up and running quickly. To find the lag in milliseconds between the timestamp of the most recently published message in a stream, topic, or partition and the timestamp of a consumer's most recently committed cursor, run the command stream cursor list. Getting Started with Apache Kafka for the Baffled, Part 1, Jun 16 2015, in Programming. We can use this command for any of the required partitions. Now run the following command to preview what the next offset will be if you reset. Kafka also has a powerful command that enables messages to be consumed from the command line. – devshawn Apr 7 at 16:36. If you need to, you can always create a new topic and write messages to that. For example:

$ /usr/bin/kafka-consumer-offset-checker --group flume --topic t1 --zookeeper zk01.com:2181
GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG OWNER
flume t1 0 1 3 2 test-consumer-group_postamac

(With kafka-consumer-groups.sh, the equivalent is --describe --group flume.) This allows users to easily see which topics have fewer than the minimum number of in-sync replicas. Messages should be one per line. To check Kafka's offset lag, you can also use docker-kafkacat - a Dockerized kafkacat, a generic command line non-JVM Apache Kafka producer and consumer. Run the command: $ kafka-console-consumer.sh. In order to send messages with both keys and values you must set the parse.key and key.separator properties. Our site is a place where you can learn Big Data technologies like Apache Hadoop, Apache Spark, Apache Kafka, NoSQL, etc. Notice the --new-consumer flag and the Kafka broker address; it does not need a ZooKeeper address as before. Unfortunately I can't get any messages using KafkaConsumer and I can't find where the problem is. So, just before jumping in head first and fully integrating with Apache Kafka, let's check the water and plan ahead for painless integration. The important part, for the purposes of demonstrating distributed tracing with Kafka and Jaeger, is that the example project makes use of a Kafka Stream (in the stream-app), a Kafka Consumer/Producer (in the consumer-app), and a Spring Kafka Consumer/Producer (in the spring-consumer-app).
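The LAG column in the offset-checker output above is simply the difference between the log-end offset and the current (committed) offset. A minimal Python sketch of that arithmetic, using the sample values from the flume/t1 row above (the function names are illustrative, not part of any Kafka API):

```python
# Consumer lag for one partition is LOG-END-OFFSET minus CURRENT-OFFSET;
# the sample values mirror the flume/t1 row above (current=1, end=3).

def partition_lag(current_offset: int, log_end_offset: int) -> int:
    """Messages written to the partition but not yet consumed."""
    return log_end_offset - current_offset

def total_lag(offsets: dict) -> int:
    """Sum lag across partitions; offsets maps partition -> (current, end)."""
    return sum(end - cur for cur, end in offsets.values())

print(partition_lag(1, 3))     # 2, matching the LAG column above
print(total_lag({0: (1, 3)}))  # 2
```

Summing across partitions gives the total lag of a consumer group on a topic, which is the number most monitoring tools alert on.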
This course provides an introduction to Apache Kafka, including architecture, use cases for Kafka, topics and partitions, working with Kafka from the command line, producers and consumers, consumer groups, Kafka message ordering, and creating producers and consumers using the Java API. With Kafka Connect, writing a file's content to a topic requires only a few simple steps. But in this approach, you need to remember to pass the additional parameters to the start command every time you start your server. As such I put my thinking cap on, and decided what I needed was a generic Kafka consumer, and a specialization of that, which simply dealt with the correct type of JSON message deserialization. When a topic contains JSON messages, Confluent users should view the messages by running kafka-console-consumer instead of kafka-avro-console-consumer. As you can see in the first chapter, Kafka Key Metrics to Monitor, the setup, tuning, and operations of Kafka require deep insights into performance metrics such as consumer lag, I/O utilization, garbage collection and many more. Once the properties files are ready, we can start the broker instances. To list the available topics in Kafka from the command line, use kafka-topics.sh --list. Take a moment to look through the options. As explained in a previous post, these scripts read from STDIN and write to STDOUT and are frequently used to send and receive data via Kafka over the command line. Install additional stage libraries to use stages that are not included in the core RPM or core tarball installation of Data Collector. The Event Hubs for Kafka feature provides a protocol head on top of Azure Event Hubs that is binary compatible with Kafka versions 1.0 and later, for both reading from and writing to Kafka topics. A consumer performance test can be run with arguments such as --zookeeper zk01.com:2181 --messages 50000000 --topic test --threads 1. A consumer group can be named by putting a group.id=(some name) property into a properties file, and then running the consumer with the --consumer.config option pointing at that file. Here localhost:2181 is one or more of your ZooKeeper instance hostnames and ports.
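A minimal sketch of the properties file you would pass via --consumer.config might look like the following (the group name is just an example, and auto.offset.reset is one more commonly set consumer property; any consumer configuration from the Kafka documentation can go here):

```properties
# consumer.properties - passed to the console consumer via --consumer.config
group.id=mygroup
# start from the beginning of the topic when no committed offset exists
auto.offset.reset=earliest
```

Running the console consumer with a fixed group.id like this is what makes its lag visible under a stable name in the consumer-group tooling.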
Every event stored by Kafka gets an offset (which is basically an ID, as the offset is increased by 1 for every event). In particular, MirrorMaker does not replicate cursors or message positions, which makes disaster recovery much more difficult than with replication of MapR Event Store For Apache Kafka. Kafka also has a command to send messages through the command line; the input can be a text file or the console standard input. For example, the Kafka message broker details and the group-id. System tools such as the Consumer Offset Checker can be run from the command line using the run class script (i.e. bin/kafka-run-class.sh package.class --options). Kafka Tuning. kafka.tools.ConsumerOffsetChecker -----Original Message----- From: Harshvardhan Chauhan Sent: Friday, March 28, 2014 12:54 PM To: [email protected] Subject: Java API to monitor Consumer Offset and Lag Hi, I am trying to write a groovy script to get consumer offsets and lag for our kafka cluster. In the examples, you might need to add the extension according to your platform. Build an endpoint that we can pass a message into, to be produced to Kafka. Conclusion. There are two parts to this question. I reproduce them here with the command line client. Using command line args: kafka-consumer-lag --brokers kafka01.home:6667 --topic topic_name --group-id consumer_group_id. Service checks: the Kafka-consumer check does not include any service checks. Talk to people. February 13, 2017, at 7:27 PM. Apache Kafka is a popular distributed message broker designed to efficiently handle large volumes of real-time data. This simulation test consists of 24 multiple choice questions and gives you the look and feel of the real Kafka certification exam. Each cluster is identified by *type* and *name*.
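A script of the kind the mailing-list post above asks for can be sketched by parsing the tabular output of the offset checker. This is a hedged illustration, not an official API: it assumes the column layout GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG OWNER shown in the sample output earlier, and the function name is made up:

```python
# Parse offset-checker / kafka-consumer-groups --describe style output and
# recompute lag per partition as LOG-END-OFFSET minus CURRENT-OFFSET.
# Assumed columns: GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG OWNER

def parse_lag(output: str) -> dict:
    """Return {(topic, partition): lag} from the describe output."""
    lags = {}
    for line in output.strip().splitlines()[1:]:  # skip the header row
        cols = line.split()
        topic, partition = cols[1], int(cols[2])
        current, log_end = int(cols[3]), int(cols[4])
        lags[(topic, partition)] = log_end - current
    return lags

sample = """GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG OWNER
flume t1 0 1 3 2 test-consumer-group_postamac"""

print(parse_lag(sample))  # {('t1', 0): 2}
```

In practice you would feed it the captured stdout of the command and alert when any per-partition lag crosses a threshold.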
As prerequisites we should have Docker installed locally, as we will run the Kafka cluster on our machine, and also the Python packages spaCy and confluent_kafka: pip install spacy confluent_kafka. Why Docker? Deploying Kafka in Docker greatly simplifies deployment, as we do not need to manually configure each broker individually! We can use a single Docker Compose file to deploy Kafka to multiple server instances using Docker Swarm in a single command. Monitoring lag is important; without it we don't know where the consumer is relative to the size of the queue. Previously, only a few metrics like message rates were available in the RabbitMQ dashboard. There are a few options. If you publish on the 'kafka-console-producer.sh' shell, you will get the same result on the 'kafka-console-consumer.sh' shell. Apache Kafka has become the leading distributed data streaming enterprise big data technology. In this tutorial, you will install and use Apache Kafka 1.0 on Ubuntu 18.04. The Kafka producer and consumer can be coded in many languages. Kafka Command Line Tools: as mentioned in the installation procedure, we install Kafka's command line tools to all hosts in your cluster. And a special message type to identify cluster info - ClusterMetadata (read Kafka Admin Command Line Internals for details). Kafka comes with a command line client that will take input from standard input and send it out as messages to the Kafka cluster. Lightbend Console (October 17 2019) enables you to monitor applications running on Kubernetes. Apache Kafka is a distributed messaging system that supports a pub/sub mechanism among other messaging models.
A handy method for deciding how many partitions to use is to first calculate the throughput for a single producer (p) and a single consumer (c), and then use that with the desired throughput (t) to roughly estimate the number of partitions to use. Prerequisites. consumer.config: lets you specify the name of a properties file that contains a set of Kafka consumer configurations. KAFKA-2061 added the --version flag to kafka-run-class.sh. Apache Kafka: A Distributed Streaming Platform. Kafka consumer: using the kafka-console-consumer.sh script, create a Kafka consumer that processes and displays messages from TutorialTopic. In this way it is a perfect example to demonstrate how it shows the position of Kafka consumer groups, including their lag. You can exit this command or keep this terminal running for further testing. We can do this by using the kafka-console-consumer tool. It can manage hundreds of metrics from all the components of Kafka (Broker, Producer and Consumer) to pinpoint consumer lag. To add a reference to a dotnet core project, execute the following at the command line. So I have also decided to dive into it and understand it. In fact Kafka ships with quite a few command line tools (we spoke above of one of them: kafka-topics), and the two we use here are kafka-console-producer and kafka-console-consumer, which reads data from a Kafka topic and writes the data to standard output. By listing the Kafka consumer groups, one can identify the consumer group related to the backup task and query its lag to determine if the backup is finished. bin/kafka-console-producer.sh and bin/kafka-console-consumer.sh in the Kafka directory are the tools that help to create a Kafka producer and Kafka consumer respectively.
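The partition-count rule of thumb described at the start of this paragraph is usually written as max(t/p, t/c). A small sketch of the arithmetic, with made-up throughput numbers purely for illustration:

```python
import math

def estimate_partitions(t: float, p: float, c: float) -> int:
    """Rough partition count for target throughput t, given single-producer
    throughput p and single-consumer throughput c (all in the same units,
    e.g. MB/s). Take the larger requirement and round up."""
    return math.ceil(max(t / p, t / c))

# e.g. target 100 MB/s with 10 MB/s per producer and 20 MB/s per consumer
print(estimate_partitions(100, 10, 20))  # 10
```

Since partitions are the unit of parallelism on both sides, the bottleneck (the slower of producer and consumer) dictates the count; over-provisioning slightly is common because partitions are hard to reduce later.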
It builds upon important stream processing concepts such as properly distinguishing between event time and processing time, windowing support, exactly-once processing semantics and simple yet efficient management of application state. Hello! I am trying to execute a simple example with Ignite and KafkaConsumer. This post isn't about installing Kafka, or configuring your cluster, or anything like that. Use kafka-topics.sh to create topics on the server. For example, you can use our command line tools to "tail" the contents of a topic. These are the principal requirements, and you will also need to be sure of what you have in your consumer configuration. Apache Storm's integration with Kafka 0.10 and later versions is highly flexible and extensible; some of the features include an enhanced configuration API. It is part of the Confluent suite. Create a Spring Kafka Kotlin Producer. Kafka is a fast-streaming messaging service equipped with a command-line client that takes standard input and converts it before sending that input out as messages. This will bring up a list of parameters that the kafka-console-consumer can receive. For this post, we are going to cover a basic view of Apache Kafka and why I feel that it is a better optimized platform than Apache Tomcat. The first step is to start the Kafka and ZooKeeper servers. Most of my settings are the default. To delete a topic: kafka-topics --zookeeper localhost:2181 --topic test --delete. Step 7: create a consumer. Kafka comes with a command line consumer that shows the messages on the console. Open a new command prompt and move to directory C:/kafka_2…
Objective: we will create a Kafka cluster with three brokers and one ZooKeeper service, one multi-partition and multi-replication topic, one producer console application that will post messages to the topic, and one consumer application to process the messages. STORM-1911: IClusterMetricsConsumer should use seconds as the timestamp unit. Starting from version 2.0, this project is a complete rewrite based on the new spring-kafka project, which uses the pure Java producer and consumer clients provided by Kafka 0.9 and later. A default group.id is generated using console-consumer-${new Random().nextInt(100000)}. My introduction to Kafka was rough, and I hit a lot of gotchas along the way. Creating a topic from the command line is very easy to do. Therefore, it is important to monitor the Kafka service and restart the kafkaloader if and when the Kafka service is interrupted. Kafka is a publish-subscribe message queuing system that's designed like a distributed commit log. Kafka offers command-line tools to manage topics and consumer groups, and to consume and publish messages, and so forth. The equivalent commands to start every service in its own terminal, without using the CLI, are: # Start ZooKeeper. To give you a guideline, I have run one of the Kafka command line utilities to send 400,000 messages, and it is done in about 1. Kafka uses ZooKeeper, which is a centralized service for maintaining configuration. Starting Kafka and ZooKeeper. Part 2 is about collecting operational data from Kafka, and Part 3 details how to monitor Kafka with Datadog. We went through how to download the Kafka distribution, start ZooKeeper and the Apache Kafka server, and send and receive messages.
These are the most commonly used Kafka commands for running a producer and consumer from the command line terminal. Apache Kafka doesn't support Prometheus metrics natively by default. Learn Kafka so that you will have a solid foundation to dive deep into different types of implementations and integrations for Kafka producers and consumers. Correct, you will see consumer group lag in kafka-consumer-groups. Creating a Kafka topic: Kafka provides a command line utility named kafka-topics.sh. In addition, Trifecta offers data import/export functions for transferring data between Kafka topics and many other big data systems (including Cassandra, ElasticSearch, MongoDB and others). The Kafka consumer config parameters may also have an impact on the performance of the spout. It's time to do performance testing before asking developers to start the testing. Kafka shell allows you to configure a list of clusters, and properties such as --bootstrap-server and --zookeeper for the currently selected cluster will automatically be added when the command is run. You can use the Confluent command line interface (CLI) to install and administer a development Confluent Platform environment. As the producer, each line in the input is considered a message from the producer. Use kafka-consumer-groups.sh to list consumer groups and their lag. Listing messages from a topic: bin/kafka-console-consumer.sh. In addition to the traditional support for Kafka version 0.9 based on the Kafka simple consumer, Apache Storm includes support for Kafka 0.10 and later, although there may be performance issues due to changes in the protocol. Scripting Kafka: to be fair, the command is short because I have simplified the Kafka console consumer in this LOC. Option 1 - read values (without message keys) from a Kafka topic with kafka-console-consumer.
In this first scenario, we will see how to manage offsets from the command line, which will give us an idea of how to implement it in our application. It is a publish/subscribe messaging system that has an interface typical of messaging systems but a storage layer more like a log-aggregation system, and it can be used for various activities, from monitoring (collection of metrics, for example) onward. Kafka Broker | Command-line Options and Procedure. This is the sixth post in this series where we go through the basics of using Kafka. You use the consumer to see messages that are created by InfoSphere Information Server. Kafka and the ELK Stack — usually these two are part of the same architectural solution, Kafka acting as a buffer in front of Logstash to ensure resiliency. It's storing all data on disk. We do this with the kafka-console-consumer.sh script in the bin directory. In this article, I will describe log compacted topics in Kafka. The only difference is a new Kafka protocol metatype, MaybeOf ("?" in notation), which when used means the value is optional in the message. May this info help. Once we have a topic, we can spin up a producer and start producing messages. Multiple clusters of the same type should be listed in the same `type. From the command-line client: Kafka has a command-line client for taking input from a particular file or standard input and pushing it as messages into the Kafka cluster: > bin/kafka-console-producer.sh
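As noted earlier, lines sent through the console producer get null keys unless parse.key is set, in which case each line is split on key.separator (tab by default). A small sketch of that splitting behaviour — this mimics the console producer's handling, it is not Kafka's actual implementation:

```python
# Mimic the console producer's parse.key / key.separator handling:
# with parse_key=True each line is split into (key, value) on the separator;
# without it, the whole line is the value and the key is None (a null key).

def parse_line(line: str, parse_key: bool = False, separator: str = "\t"):
    if not parse_key:
        return (None, line)
    key, _, value = line.partition(separator)
    return (key, value)

print(parse_line("hello world"))           # (None, 'hello world')
print(parse_line("user1\tclicked", True))  # ('user1', 'clicked')
```

Keys matter for lag analysis too: they determine which partition a message lands on, and a hot key can concentrate lag on a single partition.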
The last step is how to read the generated messages. To populate ZooKeeper, bring up at least one broker using these command line options: --hostname uses Docker templates to derive the hostname from the placement decisions. Starting a command-line Kafka consumer and inspecting InfoSphere Information Server messages: Kafka comes with a command-line consumer that directs messages to a command window. Lightbend Console. KIP-354: Add a Maximum Log Compaction Lag. Run the console consumer against our topic with the following command: bin/kafka-console-consumer.sh. Creating a topic to write to. For example, you could use such a file to set all the properties needed for an SSL/SASL connection that the consumer will invoke. .NET Framework support and the ability to run on any platform. To list topics: kafka-topics.sh --list --zookeeper localhost:2181. Push a file of messages to Kafka. Now you will be able to manage the WildFly application server remotely with the server IP. Limiting the size of messages logged to Kafka. The CURRENT-OFFSET is the last offset processed by a consumer, and the LOG-END-OFFSET is the last event offset written by a producer.
Apache Kafka is a distributed streaming platform designed for high volume publish-subscribe messages and streams. I can get the port at which I can access the Kafka brokers, and I can access the Kafka Manager at the indicated port. LoanDataKafkaConsumer consumes the loan data messages from the topic "raw_loan_data_ingest". If you are not familiar with Apache Kafka or want to learn about it, check out their site! kafka-console-consumer is a command line consumer that reads data from a Kafka topic and writes it to standard output. consumer: 'tsc' is not recognized as an internal or external command. 255:2181 --topic eventbus. To learn Kafka easily, step-by-step, you have come to the right place! No prior Kafka knowledge is required. Hi Robert, thanks for your response. In this video, I will provide a quick start demo. The ~/kafka/bin/kafka-console-producer.sh script publishes messages. Let us understand how to use the command-line Kafka producer and consumer in a Kafka multi-node cluster or single-node cluster.
Usually when I invite Apache Kafka into a project I end up writing my own wrappers around Kafka's producers and consumers. Kafka Messaging Modular Input: the Kafka consumer is apparently connected, but how do we troubleshoot why we see no data? Data exploration: enhanced SHOW TOPICS command. Beware that the consumer will commit its offset to ZooKeeper after a certain interval (default 10 seconds), so if you run this command a few times in a row you'll likely see the offset remain constant whilst the lag increases, until a commit from the consumer suddenly brings the offset up, and hence the lag down, significantly in one go. Run this command in its own terminal. To consume messages we open a second bash shell and cd into the bin directory as before, and to receive messages we use the kafka-console-consumer command line client. How do I build a system that makes it unlikely for consumers to lag? The answer is that you want to be able to add enough consumers to handle all the incoming data. Each line typed in the input is sent as a single message to the cluster. Kafka is used in production by over 33% of the Fortune 500 companies, such as Netflix, Airbnb, Uber, Walmart and LinkedIn. For the list of configurations, please reference the Apache Kafka page. The Console provides visibility for KPIs, reactive metrics, monitors and alerting, and includes a large selection of ready-to-use dashboards.
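The sawtooth behaviour described above — committed offset flat while lag grows, then a sharp drop at each commit — can be simulated without a cluster. A sketch under simplifying assumptions (steady producer, consumer that keeps up exactly but only commits on an interval; rates and names are made up):

```python
# Simulate what repeated lag checks show when the consumer only commits
# every commit_interval seconds: the committed offset stays flat while
# the log-end offset grows, so observed lag rises, then drops at commit.

def observed_lag(seconds: int, produce_rate: int = 5, commit_interval: int = 10):
    """Observed lag at each second for a consumer that keeps up exactly
    but only commits its position at multiples of commit_interval."""
    lags = []
    committed = 0
    for t in range(1, seconds + 1):
        log_end = t * produce_rate          # messages produced so far
        if t % commit_interval == 0:
            committed = log_end             # consumer commits its position
        lags.append(log_end - committed)
    return lags

print(observed_lag(20))  # rises to 45, drops to 0 at t=10 and t=20
```

The takeaway is that a single lag reading can badly overstate real lag; you have to sample over at least one commit interval before concluding a consumer is behind.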
But it can be painful too. Kafka is the leading open-source, enterprise-scale data streaming technology. MemSQL extends our operational data platform with an on-demand, elastic cloud service, and new features to support Tier 1 workloads. Related posts: How To Compact Druid Data Segments Using Compaction Task. It exposes some metrics of cluster health, around topic consumer lag etc. This tool has been removed in Kafka 1.0. Step #6: Using the Kafka consumer.
Typically, you would publish messages using a Kafka client library from within your program, but since that involves different setups for different programming languages, you can use the shell script as a language-independent way of doing it. ProducerPerformance test7 50000000 100 -1 acks=1 bootstrap.servers=… Create a Spring Kafka Kotlin Consumer. Kafka's mirroring feature makes it possible to maintain a replica of an existing Kafka cluster. For example, Cap'n Proto requires the path to the schema file and the name of the root schema. If we migrated from a previous Kafka version, then depending on the broker configuration, Kafka can dual-write the offsets into ZooKeeper and Kafka's __consumer_offsets topic (see dual.commit.enabled). Consumer metrics: there's a nice write-up on which metrics are important to track per category. We do that by using a couple of Kafka command line tools that ship with any Kafka installation. Consumer lag per client. It helps you move your data where you need it, in real time, reducing the headaches that come with integrations between multiple source and target systems. It also covers which Kafka console tools work with IBM Event Streams and whether there are CLI equivalents. Apache Kafka is a versatile distributed messaging system, developed initially by LinkedIn to handle their growing need for message processing. Apache Kafka Tutorial for Beginners in English - https:… Below is the response: 1) What version of Kafka are you using? - 1.10-2. 2) Can you read from the Kafka topic using the command line Kafka consumer utility that comes with the Kafka install? Kafka Connect is a framework for connecting Kafka with external systems such as databases, key-value stores, search indexes, and file systems, using so-called Connectors.
Path to a properties file where you can set the consumer configuration — similar to what you provide to the Kafka command line tools. Small utility to get consumer lag from Kafka-stored consumer offsets. The Datadog Agent emits an event when the value of the consumer_lag metric goes below 0, tagging it with topic, partition and consumer_group. Configuring your Kafka deployment to expose metrics. They are very essential when we work with Apache Kafka. Kafka is a key backbone of IoT streaming analytics applications. Confluent CLI. The metrics will be available only if Kafka is used as the consumer offset store.