Developer: Hamed Karbasi


Kafka Datasource for Grafana



Visualize real-time Kafka data in Grafana dashboards.


Why Kafka Datasource?

  • Live streaming: Monitor Kafka topics in real time.
  • Flexible queries: Select topics, partitions, offsets, and timestamp modes.
  • Rich JSON support: Handles flat, nested, and array data.
  • Secure: SASL authentication & SSL/TLS encryption.
  • Easy setup: Install and configure in minutes.

How It Works

This plugin connects your Grafana instance directly to Kafka brokers, allowing you to query, visualize, and explore streaming data with powerful time-series panels and dashboards.

Requirements

  • Apache Kafka v0.9+
  • Grafana v10.2+

Note: This is a backend plugin, so the Grafana server itself must be able to reach the Kafka brokers.

Features

  • Real-time monitoring of Kafka topics
  • Query all or specific partitions
  • Autocomplete for topic names
  • Flexible offset options (latest, last N, earliest)
  • Timestamp modes (Kafka event time, dashboard received time)
  • Advanced JSON support (flat, nested, arrays, mixed types)
  • Kafka authentication (SASL) & encryption (SSL/TLS)

Installation

Via grafana-cli

grafana-cli plugins install hamedkarbasi93-kafka-datasource

Via zip file

Download the latest release and unpack it into your Grafana plugins directory (default: /var/lib/grafana/plugins).

Provisioning

You can automatically configure the Kafka datasource using Grafana's provisioning feature. For a ready-to-use template and configuration options, refer to provisioning/datasources/datasource.yaml in this repository.
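
For reference, here is a minimal provisioning sketch. The top-level layout (apiVersion, datasources, name, type, access) is standard Grafana data source provisioning, and the type matches the plugin ID used for installation; the jsonData keys are assumptions, so treat the repository template as authoritative:

# Minimal provisioning sketch; verify the jsonData field names against
# provisioning/datasources/datasource.yaml in the repository.
apiVersion: 1

datasources:
  - name: Kafka
    type: hamedkarbasi93-kafka-datasource  # plugin ID, as used with grafana-cli
    access: proxy
    jsonData:
      # Assumed key for the broker address; the real template may differ.
      bootstrapServers: localhost:9094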

Usage

Configuration

  1. Add a new data source in Grafana and select "Kafka Datasource".
  2. Configure connection settings:
    • Broker address (e.g. localhost:9094 or kafka:9092)
    • Authentication (SASL, SSL/TLS, optional)
    • Timeout settings (default: two seconds)

Build the Query

  1. Create a new dashboard panel in Grafana.
  2. Select your Kafka data source.
  3. Configure the query:
    • Topic: Enter or select your Kafka topic (autocomplete available).
    • Fetch Partitions: Click to retrieve available partitions.
    • Partition: Choose a specific partition or "all" for all partitions.
    • Offset Reset:
      • latest: Only new messages
      • last N messages: Start from the most recent N messages (set N in the UI)
      • earliest: Start from the oldest message
    • Timestamp Mode: Choose between Kafka event time or dashboard received time.

Tip: Numeric fields become time series, string fields become labels, and arrays and nested objects are automatically flattened for visualization.

Supported JSON Structures

  • Flat objects
  • Nested objects (flattened)
  • Top-level arrays
  • Mixed types

Examples:

Simple flat object:

{
	"temperature": 23.5,
	"humidity": 65.2,
	"status": "active"
}

Nested object (flattened as user.name, user.age, settings.theme):

{
	"user": {
		"name": "John Doe",
		"age": 30
	},
	"settings": {
		"theme": "dark"
	}
}

Top-level array (flattened as item_0.id, item_0.value, item_1.id, etc.):

[
	{"id": 1, "value": 10.5},
	{"id": 2, "value": 20.3}
]

Limitations

  • Max flattening depth: 5
  • Max fields per message: 1000
  • Protobuf/AVRO not yet supported

Live Demo

[Screenshot: Kafka dashboard with live streaming data]

Sample Data Generator

Want to test the plugin? Use our Go sample producer to generate realistic Kafka messages:

go run example/go/producer.go -broker localhost:9094 -topic test -interval 500 -num-partitions 3 -shape nested

Supports flat, nested, and array JSON payloads. See example/README.md for details.
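
If you prefer a self-contained starting point, below is a minimal sketch of such a producer. It assumes the github.com/segmentio/kafka-go client; the bundled example/go/producer.go may use a different client and flag handling, so treat it as illustrative only.

// producer_sketch.go: emits flat JSON messages for the plugin to consume.
package main

import (
	"context"
	"encoding/json"
	"log"
	"math/rand"
	"time"

	"github.com/segmentio/kafka-go"
)

func main() {
	// Target the same broker and topic as the command above.
	w := &kafka.Writer{
		Addr:  kafka.TCP("localhost:9094"),
		Topic: "test",
	}
	defer w.Close()

	for {
		// Payload shaped like the "Simple flat object" example above.
		payload, err := json.Marshal(map[string]any{
			"temperature": 20 + rand.Float64()*10,
			"humidity":    60 + rand.Float64()*10,
		})
		if err != nil {
			log.Fatal(err)
		}
		if err := w.WriteMessages(context.Background(), kafka.Message{Value: payload}); err != nil {
			log.Fatalf("write failed: %v", err)
		}
		time.Sleep(500 * time.Millisecond) // roughly the -interval 500 behavior
	}
}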

FAQ & Troubleshooting

  • Can I use this with any Kafka broker? Yes, it supports Apache Kafka v0.9+ and compatible brokers.
  • Does it support secure connections? Yes, SASL and SSL/TLS are supported.
  • What JSON formats are supported? Flat, nested, arrays, mixed types.
  • How do I generate test data? Use the included Go or Python producers.
  • Where do I find more help? See this README or open an issue.

Documentation & Links

Additional documentation is available in the repository: docs/development.md, docs/contributing.md, and docs/code_of_conduct.md.

Support & Community

If you find this plugin useful, please consider giving it a ⭐ on GitHub or supporting development:

Buy Me a Coffee

For more information, see the documentation files above or open an issue/PR.

Installing Kafka on Grafana Cloud:

For more information, visit the docs on plugin installation.

Installing on a local Grafana:

For local instances, plugins are installed and updated via a simple CLI command. Plugins are not updated automatically; however, you will be notified within Grafana when updates are available.

1. Install the Data Source

Use the grafana-cli tool to install the Kafka data source from the command line:

grafana-cli plugins install hamedkarbasi93-kafka-datasource

The plugin will be installed into your Grafana plugins directory; the default is /var/lib/grafana/plugins. See the Grafana CLI documentation for more information.

2. Configure the Data Source

Newly installed data sources can be added immediately from the Data Sources section of the Grafana main menu.

Next, click the Add data source button in the upper right. The data source will be available for selection in the Type select box.

To see a list of installed data sources, click the Plugins item in the main menu. Both core data sources and installed data sources will appear.

Changelog

v1.0.1

Full Changelog

  • Restructure and split documentation (README, docs/development.md, docs/contributing.md, docs/code_of_conduct.md)
  • Add “Create Plugin Update” GitHub Action
  • Add release workflow pre-check to ensure tag exists in CHANGELOG
  • Bump plugin and package versions to 1.0.1

v1.0.0 (2025-08-14)

Full Changelog

  • Add support for the nested JSON #83 (hoptical)
  • Support for Nested JSON via Automatic Flattening #79

v0.6.0 (2025-08-13)

Full Changelog

  • Use kafka message time as default option #82 (hoptical)
  • Add options to auto offset reset #81 (hoptical)
  • Add/support all partitions #80 (hoptical)
  • Topic Selection with Autocomplete #78
  • Improve the offset reset field #76
  • Add support for selecting all partitions #75
  • Is it possible to reset offset? #47
  • Clarify the options of the timestamp mode #77

v0.5.0 (2025-08-01)

Full Changelog

v0.4.0 (2025-07-19)

Full Changelog

v0.3.0 (2025-05-24)

Full Changelog

  • Fix/release #65 (hoptical)
  • Fix/streaming logic #64 (hoptical)
  • add value-offset to go producer example #63 (hoptical)
  • Update/grafana go sdk #62 (hoptical)
  • Add/golang example #60 (hoptical)
  • Add/redpanda console #58 (hoptical)
  • Add Kafka to docker compose #54 (sizovilya)
  • [BUG] Cannot connect to the brokers #55
  • [BUG] Plugin Unavailable #45
  • plugin unavailable #44
  • [BUG] 2 Kafka panels on 1 dashboard #39
  • [BUG] The yarn.lock is out of sync with package.json since the 0.2.0 commit #35
  • User has to refresh page to trigger streaming #28
  • Mage Error #6
  • use access policy token instead of the legacy one #66 (hoptical)
  • Fix switch component #61 (sizovilya)
  • Change Kafka driver #57 (sizovilya)
  • Provide default options to UI #56 (sizovilya)
  • Fix CI Failure #53 (hoptical)
  • Add developer-friendly environment #52
  • Migrate from Grafana toolkit #50
  • Can you make an arm64 compatible version of this plugin? #49
  • Add Authentication & Authorization Configuration #20
  • Migrate from toolkit #51 (sizovilya)
  • Update README.md #46 (0BVer)
  • Add support for using AWS MSK Managed Kafka #38 (ksquaredkey)
  • Issue 35 update readme node14 for dev (#1) #37 (ksquaredkey)

v0.2.0 (2022-07-26)

Full Changelog

v0.1.0 (2021-11-14)

Full Changelog

* This Changelog was automatically generated by github_changelog_generator