Instrument TypeScript and JavaScript agents
This guide shows you how to install the AI Observability JavaScript SDK, instrument an LLM call, and verify that generation data reaches AI Observability.
Note
AI Observability is referred to as “Sigil” in SDKs, package names, and configuration. For example, the npm package is @grafana/sigil-sdk-js.
Before you begin
- A running AI Observability instance or Grafana Cloud stack with AI Observability enabled.
- Node.js 22 or later.
- Your AI Observability generation export endpoint URL.
- A Cloud Access Policy Token with the sigil:write scope. Refer to Create an API key for step-by-step instructions.
Install the SDK
npm install @grafana/sigil-sdk-js

Use a framework integration
If you use LangChain, LangGraph, OpenAI Agents, LlamaIndex, Google ADK, or Vercel AI SDK, import the framework sub-module for automatic generation capture:
- @grafana/sigil-sdk-js/langchain
- @grafana/sigil-sdk-js/langgraph
- @grafana/sigil-sdk-js/openai-agents
- @grafana/sigil-sdk-js/llamaindex
- @grafana/sigil-sdk-js/google-adk
- @grafana/sigil-sdk-js/vercel-ai-sdk
Each integration attaches callbacks or hooks that capture generations automatically. Refer to Instrument agents with frameworks for setup details.
Capture a generation manually
To instrument calls without a framework:
import { SigilClient } from "@grafana/sigil-sdk-js";
const client = new SigilClient({
  generationExport: {
    protocol: "http",
    endpoint: "<SIGIL_ENDPOINT>/api/v1/generations:export",
    auth: { mode: "tenant", tenantId: "<TENANT_ID>" },
  },
});

await client.startGeneration(
  {
    conversationId: "conv-1",
    model: { provider: "openai", name: "gpt-4o" },
  },
  async (recorder) => {
    recorder.setResult({
      output: [{ role: "assistant", content: "Hello from Sigil" }],
    });
  },
);

await client.shutdown();

Replace SIGIL_ENDPOINT and TENANT_ID with your values.
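The callback-based startGeneration API follows a scoped-resource pattern: the client opens a generation, hands your callback a recorder, and finalizes the generation for export when the callback resolves. As a generic, self-contained illustration of that pattern (not the SDK's actual implementation — every name here is hypothetical):

```typescript
// Illustrative sketch of a scoped-recorder pattern, similar in shape to
// startGeneration. Hypothetical types and names, not the Sigil SDK.
type Message = { role: string; content: string };

class Recorder {
  result: Message[] | null = null;
  setResult(r: { output: Message[] }) {
    this.result = r.output;
  }
}

async function withGeneration<T>(
  fn: (recorder: Recorder) => Promise<T>,
): Promise<Message[] | null> {
  const recorder = new Recorder();
  await fn(recorder); // user code runs inside the generation scope
  return recorder.result; // in a real client, the result is exported here
}
```

Because the export happens when the callback resolves, work that should be attributed to the generation belongs inside the callback body.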
Use a provider helper
The SDK includes helpers for OpenAI, Anthropic, and Gemini. For example, with OpenAI:
import { SigilClient, openai } from "@grafana/sigil-sdk-js";
const sigil = new SigilClient({
  /* config */
});

const response = await openai.chat.completions.create(sigil, openaiClient, {
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is observability?" }],
});
await sigil.shutdown();

Configure authentication
For Grafana Cloud, use basic auth with your Cloud Access Policy Token:
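The basic mode corresponds to standard HTTP Basic authentication: the instance ID and token are joined with a colon and base64-encoded into an Authorization header value. A minimal sketch of that encoding (whether the SDK sends exactly this header is an assumption; the function name is illustrative):

```typescript
// Build an HTTP Basic Authorization header value from a Grafana Cloud
// instance ID and a Cloud Access Policy Token (standard RFC 7617 encoding).
function basicAuthHeader(instanceId: string, token: string): string {
  return "Basic " + Buffer.from(`${instanceId}:${token}`).toString("base64");
}
```

For example, basicAuthHeader("user", "pass") yields "Basic dXNlcjpwYXNz".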
const client = new SigilClient({
  generationExport: {
    protocol: "http",
    endpoint: "<SIGIL_ENDPOINT>/api/v1/generations:export",
    auth: {
      mode: "basic",
      tenantId: "<INSTANCE_ID>",
      basicPassword: "<API_KEY>",
    },
  },
});

Set up traces and metrics
The SDK emits OpenTelemetry spans and metrics alongside generation data. To export them, configure a TracerProvider and MeterProvider in your application before creating the Sigil client. Without this, traces and metrics are silently lost.
Refer to Set up traces and metrics for Grafana Cloud OTLP options and SDK configuration for setup snippets.
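As a minimal sketch of that wiring (package names are from the standard OpenTelemetry JavaScript SDK; the OTLP endpoint is a placeholder you would replace with your Grafana Cloud OTLP values):

```typescript
// Minimal OpenTelemetry provider setup (sketch). Register these before
// creating the Sigil client so its spans and metrics have an exporter.
import { metrics } from "@opentelemetry/api";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import {
  MeterProvider,
  PeriodicExportingMetricReader,
} from "@opentelemetry/sdk-metrics";
import { OTLPMetricExporter } from "@opentelemetry/exporter-metrics-otlp-http";

const tracerProvider = new NodeTracerProvider({
  spanProcessors: [
    new BatchSpanProcessor(
      new OTLPTraceExporter({ url: "<OTLP_ENDPOINT>/v1/traces" }),
    ),
  ],
});
tracerProvider.register(); // sets the global TracerProvider

const meterProvider = new MeterProvider({
  readers: [
    new PeriodicExportingMetricReader({
      exporter: new OTLPMetricExporter({ url: "<OTLP_ENDPOINT>/v1/metrics" }),
    }),
  ],
});
metrics.setGlobalMeterProvider(meterProvider);
```

This is configuration wiring only; authentication headers and resource attributes for your OTLP endpoint are covered in the pages referenced above.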
Verify data
Open the AI Observability plugin in Grafana and navigate to Conversations. Your generation should appear within a few seconds. Check your Traces and Metrics data sources for SDK-emitted spans and metrics.