What's new from Grafana Labs
Available in public preview in Grafana Cloud and open source · Traces
Release date: 2025-08-08

Access tracing data using MCP server in Grafana Cloud Traces

We're excited to announce the integration of the Model Context Protocol (MCP) into Grafana Cloud Traces and into open-source Tempo (merged, and available starting in Tempo 2.9). MCP, a standard developed by Anthropic, lets data sources expose data and functionality to Large Language Models (LLMs) via an agent.

This integration opens up new possibilities for interacting with tracing data. You can now connect LLM-powered tools like Claude Code or Cursor to Grafana Cloud Traces, enabling you to:

  • Explore services and understand interactions: Use an LLM to teach new developers how the services in an application interact by analyzing tracing data. For instance, a new developer can ask the AI to explain how their services communicate, and the AI answers using live tracing data from Cloud Traces.
  • Diagnose and investigate errors: Leverage LLMs to identify and diagnose errors in your systems. The AI can answer questions like "Are there errors in my services?" and "Which endpoints are affected?"
  • Optimize performance and reduce latency: LLMs can assist in identifying the causes of latency and guiding optimization efforts. By analyzing trace data, an LLM can summarize operations in a request path, pinpoint bottlenecks, and even suggest code changes to improve performance, such as parallelizing operations.

Setting up an LLM agent with Grafana Cloud Traces requires a Grafana Cloud API token and some client-side configuration. This capability is under active development, and we welcome early adopters to try it out and share feedback.
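As a rough illustration of that client-side configuration, here is a minimal sketch of an MCP server entry for a tool such as Cursor, which reads server definitions from an `mcp.json` file. The URL path, stack name, and header shown are placeholders we've assumed for the example, not the actual Grafana Cloud endpoint; consult the Grafana Cloud Traces documentation for the exact URL and authentication details:

```json
{
  "mcpServers": {
    "grafana-cloud-traces": {
      "url": "https://<your-stack>.grafana.net/api/mcp",
      "headers": {
        "Authorization": "Bearer <your-grafana-cloud-api-token>"
      }
    }
  }
}
```

Once the agent is configured, questions like the ones above ("Are there errors in my services?") are answered by the LLM calling the MCP server's tools against your live trace data.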
