# Instrument agents with frameworks
AI Observability framework integrations capture generations automatically by attaching callbacks or hooks to your agent framework. This eliminates the need to manually instrument each LLM call.
## Supported frameworks
| Framework | Python | TypeScript | Go | Java |
|---|---|---|---|---|
| LangChain | Yes | Yes | — | — |
| LangGraph | Yes | Yes | — | — |
| OpenAI Agents | Yes | Yes | — | — |
| LlamaIndex | Yes | Yes | — | — |
| Google ADK | Yes | Yes | Yes | Yes |
| Vercel AI SDK | — | Yes | — | — |
## Set up a Python framework integration

Install the framework-specific package alongside the core SDK:

```shell
pip install sigil-sdk sigil-sdk-langchain
```

Attach the AI Observability callback handler to your framework. For LangChain:
```python
from sigil_sdk import Client, ClientConfig
from sigil_sdk_langchain import SigilLangChainHandler

client = Client(ClientConfig(
    generation_export_endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
))

handler = SigilLangChainHandler(client)

# Pass the handler to your chain or agent
chain.invoke({"input": "Hello"}, config={"callbacks": [handler]})

client.shutdown()
```

Each framework integration follows the same pattern: create a handler, pass it to your framework's callback mechanism, and the integration captures all LLM calls as generations.
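The pattern can be pictured with a toy handler and chain in plain Python. This is an illustration only, not the real SDK: `RecordingHandler` and `ToyChain` are invented names, and a real integration would export each captured generation rather than collect it in memory.

```python
# Toy illustration of the callback pattern: the framework invokes the
# handler once per LLM call, so the application code never exports
# generations manually.

class RecordingHandler:
    def __init__(self):
        self.generations = []

    def on_llm_end(self, prompt, completion):
        # A real integration would export this to the observability
        # backend; here we just collect it in memory.
        self.generations.append({"prompt": prompt, "completion": completion})


class ToyChain:
    def invoke(self, inputs, config=None):
        completion = f"echo: {inputs['input']}"
        # Notify every registered callback handler, as a framework would.
        for handler in (config or {}).get("callbacks", []):
            handler.on_llm_end(inputs["input"], completion)
        return completion


handler = RecordingHandler()
chain = ToyChain()
chain.invoke({"input": "Hello"}, config={"callbacks": [handler]})
print(handler.generations)  # one generation captured, no manual export call
```

The application code only passes the handler along; the capture happens inside the framework's callback dispatch.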
## Set up a TypeScript framework integration
Import the framework sub-module:
```typescript
import { SigilClient } from "@grafana/sigil-sdk-js";
import { SigilLangChainHandler } from "@grafana/sigil-sdk-js/langchain";

const client = new SigilClient({
  /* config */
});
const handler = new SigilLangChainHandler(client);

// Pass the handler to your chain or agent
await chain.invoke({ input: "Hello" }, { callbacks: [handler] });

await client.shutdown();
```

## Conversation ID mapping
Framework integrations automatically map conversation IDs from framework context:

- If the framework provides a `session_id`, `conversation_id`, or `group_id`, the integration uses it.
- If a `thread_id` is available (LangGraph, OpenAI Agents), the integration uses it.
- Otherwise, the integration generates a deterministic ID from the framework run context.
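The deterministic fallback can be sketched as deriving a stable, name-based UUID from the run context. This is an illustrative sketch only; the `conversation_id` function and the namespace string below are invented, and the SDK's actual derivation is not documented here.

```python
import uuid

# Name-based UUIDs (uuid5) are deterministic: the same inputs always
# produce the same ID, so every generation within one framework run
# groups under one conversation. (Namespace string is illustrative.)
NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "sigil.framework.conversation")

def conversation_id(framework: str, root_run_id: str) -> str:
    return str(uuid.uuid5(NAMESPACE, f"{framework}:{root_run_id}"))

a = conversation_id("langchain", "run-123")
b = conversation_id("langchain", "run-123")
c = conversation_id("langchain", "run-456")
assert a == b  # stable across repeated calls for the same run
assert a != c  # distinct runs get distinct conversation IDs
```

Determinism matters here because the same run may emit many generations, each of which must map to the same conversation without any shared mutable state.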
## Metadata

Framework integrations inject metadata into each generation:

- `sigil.framework.name`: the framework name, for example, `langchain`.
- `sigil.framework.source`: how the integration captures data.
- `sigil.framework.language`: the programming language.
Additional framework-specific metadata like `run_id`, `thread_id`, `component_name`, and `tags` is included when available.
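Put together, the metadata attached to a single generation might look like the dictionary below. All values are illustrative; only the three `sigil.framework.*` key names come from the list above.

```python
# Example generation metadata (illustrative values).
generation_metadata = {
    # Always injected by the integration
    "sigil.framework.name": "langchain",
    "sigil.framework.source": "callback",   # illustrative value
    "sigil.framework.language": "python",
    # Framework-specific fields, included when available
    "run_id": "run-9f1c2a6e",
    "thread_id": "thread-42",
    "component_name": "AgentExecutor",
    "tags": ["production", "support-bot"],
}

# The three sigil.framework.* keys are present on every generation.
assert all(key in generation_metadata for key in (
    "sigil.framework.name",
    "sigil.framework.source",
    "sigil.framework.language",
))
```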