---
title: "Instrument Python agents | Grafana Cloud documentation"
description: "Install the AI Observability Python SDK and capture your first generation from a Python agent."
---

# Instrument Python agents

This guide shows you how to install the AI Observability Python SDK, instrument an LLM call, and verify that generation data reaches AI Observability.

> Note
> 
> AI Observability is referred to as “Sigil” in SDKs, package names, and configuration. For example, the Python package is `sigil-sdk` and imports use `sigil_sdk`.

## Before you begin

- A running AI Observability instance or Grafana Cloud stack with AI Observability enabled.
- Python 3.9 or later.
- Your AI Observability generation export endpoint URL.
- A Cloud Access Policy Token with the `sigil:write` scope. Refer to [Create an API key](/docs/grafana-cloud/machine-learning/ai-observability/get-started/grafana-cloud/#create-an-api-key) for step-by-step instructions.

## Install the SDK

```bash
pip install sigil-sdk
```

To use a provider helper, install the corresponding package:

```bash
pip install sigil-sdk-openai
pip install sigil-sdk-anthropic
pip install sigil-sdk-gemini
```

## Use a framework integration

If you use LangChain, LangGraph, OpenAI Agents, LlamaIndex, or Google ADK, install the corresponding framework package for automatic generation capture:

```bash
pip install sigil-sdk-langchain
pip install sigil-sdk-langgraph
pip install sigil-sdk-openai-agents
pip install sigil-sdk-llamaindex
pip install sigil-sdk-google-adk
```

Framework integrations inject callbacks that capture generations automatically. Refer to [Instrument agents with frameworks](/docs/grafana-cloud/machine-learning/ai-observability/guides/instrument-agents) for setup details.

## Capture a generation manually

To instrument calls without a framework, use the `start_generation` context manager:

```python
from sigil_sdk import Client, ClientConfig, GenerationStart, ModelRef, assistant_text_message

client = Client(
    ClientConfig(
        generation_export_endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
    )
)

with client.start_generation(
    GenerationStart(
        conversation_id="conv-1",
        model=ModelRef(provider="openai", name="gpt-4o"),
    )
) as rec:
    rec.set_result(output=[assistant_text_message("Hello from Sigil")])

client.shutdown()
```

Replace `<SIGIL_ENDPOINT>` with your AI Observability API base URL.

## Use a provider helper

Provider helpers capture generations automatically from your LLM client calls. For example, with OpenAI:

```python
import openai
from sigil_sdk import Client, ClientConfig
from sigil_sdk_openai import chat

sigil = Client(
    ClientConfig(
        generation_export_endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
    )
)
openai_client = openai.OpenAI()

response = chat.completions.create(
    sigil,
    openai_client,
    {"model": "gpt-4o", "messages": [{"role": "user", "content": "What is observability?"}]},
)

sigil.shutdown()
```
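In longer-running agents, an unhandled exception between the LLM call and `shutdown()` can drop buffered generations. A minimal sketch of one way to guard against that, assuming the `shutdown()` method shown above is safe to call at interpreter exit:

```python
import atexit

from sigil_sdk import Client, ClientConfig

sigil = Client(
    ClientConfig(
        generation_export_endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
    )
)

# Flush any pending generations even if the process exits early
# or an exception unwinds past the explicit shutdown() call.
atexit.register(sigil.shutdown)
```

This keeps the explicit `shutdown()` calls in the examples above optional rather than required for data not to be lost.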

## Configure authentication

For Grafana Cloud, use basic auth with your Cloud Access Policy Token:

```python
from sigil_sdk import Client, ClientConfig
from sigil_sdk.config import GenerationExportConfig, AuthConfig

client = Client(
    ClientConfig(
        generation_export=GenerationExportConfig(
            protocol="http",
            endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
            auth=AuthConfig(
                mode="basic",
                tenant_id="<INSTANCE_ID>",
                basic_password="<API_KEY>",
            ),
        ),
    )
)
```

For self-hosted with tenant mode:

```python
client = Client(
    ClientConfig(
        generation_export=GenerationExportConfig(
            protocol="http",
            endpoint="<SIGIL_ENDPOINT>/api/v1/generations:export",
            auth=AuthConfig(
                mode="tenant",
                tenant_id="<TENANT_ID>",
            ),
        ),
    )
)
```

## Set up traces and metrics

The SDK emits OpenTelemetry spans and metrics alongside generation data. To export them, configure a `TracerProvider` and `MeterProvider` in your application before creating the Sigil client. Without this, traces and metrics are silently lost.
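As one example, the standard OpenTelemetry Python SDK can register global providers that export both signals over OTLP/HTTP. This is a generic OpenTelemetry setup sketch, not Sigil-specific; the exporters read their endpoint and headers from the usual `OTEL_EXPORTER_OTLP_*` environment variables:

```python
from opentelemetry import metrics, trace
from opentelemetry.exporter.otlp.proto.http.metric_exporter import OTLPMetricExporter
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Register global providers before constructing the Sigil client,
# so its spans and metrics have somewhere to go.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(tracer_provider)

meter_provider = MeterProvider(
    metric_readers=[PeriodicExportingMetricReader(OTLPMetricExporter())]
)
metrics.set_meter_provider(meter_provider)
```

Requires the `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-http` packages.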

Refer to [Set up traces and metrics](/docs/grafana-cloud/machine-learning/ai-observability/get-started/grafana-cloud/#set-up-traces-and-metrics) for Grafana Cloud OTLP options and [SDK configuration](/docs/grafana-cloud/machine-learning/ai-observability/configure/sdk/#opentelemetry-setup) for setup snippets.

## Verify data

Open the AI Observability plugin in Grafana and navigate to **Conversations**. Your generation should appear within a few seconds. Check your **Traces** and **Metrics** data sources for SDK-emitted spans and metrics.

## Next steps

- [Configure SDK options](/docs/grafana-cloud/machine-learning/ai-observability/configure/sdk)
- [Browse conversations](/docs/grafana-cloud/machine-learning/ai-observability/guides/conversations)
