Grafana AI Observability
Note
Grafana AI Observability is currently in public preview. Grafana Labs offers limited support, and breaking changes might occur prior to the feature being made generally available.
Overview
Grafana AI Observability is built on OpenTelemetry and gives teams running LLM agents in production a single place to monitor agent activity, trace conversations, track costs, and evaluate quality.
AI Observability provides thin SDKs for Go, Python, TypeScript, Java, and .NET that capture generation data with minimal code changes. Built-in framework integrations for LangChain, LangGraph, OpenAI Agents, Vercel AI SDK, and others make instrumentation automatic.
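To make "capturing generation data" concrete, here is a minimal sketch of the kind of span attributes an instrumentation SDK might attach to a single LLM generation. The attribute names follow the OpenTelemetry GenAI semantic conventions (`gen_ai.*`); the cost attribute, the helper function, and all values are illustrative assumptions, not the Grafana SDK's actual API.

```python
# Illustrative sketch only: attributes an AI observability SDK might record
# on one LLM generation span. Names follow the OpenTelemetry GenAI semantic
# conventions; "gen_ai.usage.cost" and the per-token rates are hypothetical.

def generation_attributes(model: str, input_tokens: int, output_tokens: int,
                          rate_per_1k_in: float, rate_per_1k_out: float) -> dict:
    """Build OpenTelemetry-style attributes for a single generation."""
    return {
        "gen_ai.operation.name": "chat",
        "gen_ai.request.model": model,
        "gen_ai.usage.input_tokens": input_tokens,
        "gen_ai.usage.output_tokens": output_tokens,
        # Cost tracking derives from token usage; rates are example values.
        "gen_ai.usage.cost": round(
            input_tokens / 1000 * rate_per_1k_in
            + output_tokens / 1000 * rate_per_1k_out, 6),
    }

attrs = generation_attributes("example-model", 1200, 350, 0.005, 0.015)
print(attrs["gen_ai.usage.cost"])  # 0.006 + 0.00525 = 0.01125
```

In a real deployment these attributes ride on OpenTelemetry spans, which is what lets the Grafana plugin aggregate them into cost and usage dashboards and link each generation back to its trace.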
With the Grafana plugin, you can browse conversations, drill into traces, compare agent versions, configure online evaluation rules, and use pre-built dashboards for metrics, logs, traces, and profiles.