Set up instrumentation for AI observability in Python
Follow these steps to get started quickly with Grafana Cloud AI Observability:
- Install the AI Observability Integration
- Instrument your AI Application
- Monitor using the pre-built GenAI Observability dashboard
Install the AI Observability Integration
To install the AI Observability integration:
In your Grafana Cloud stack, click Connections in the left-side menu.
Search for AI Observability.
Click the AI Observability card and follow the instructions to instrument your application. Alternatively, follow the instructions in Instrument your AI application.
Click Install dashboards to install the pre-built GenAI Observability dashboard.
Instrument your AI application
To instrument your AI application using Grafana Cloud, follow these steps:
Sign in to Grafana Cloud:
- If you don’t have an account, register for a free Grafana Cloud account.
- Go to the Grafana Cloud Portal.
- If you have access to multiple organizations, select one from the top-left dropdown.
- If your organization has multiple stacks, pick a stack from the sidebar or the main stack list.
Configure OpenTelemetry:
- Click Configure under the OpenTelemetry section.
Generate an API Token:
- In the Password / API Token section, click Generate now.
- Name the token, for example, chatbot.
- Click Create token.
- Click Close. You don’t need to copy the token.
- Scroll down and copy the OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS values. Save these for later.
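The OTEL_EXPORTER_OTLP_HEADERS value you copy is a Basic auth credential built from your stack's instance ID and the API token, with the space after "Basic" URL-encoded as %20. As a rough illustration of how that value is composed (the instance ID and token below are made-up placeholders, not real credentials):

```python
import base64

# Hypothetical placeholder values -- use your own instance ID and API token.
instance_id = "123456"
api_token = "glc_example_token"

# Basic auth credential: base64("instanceID:token")
credential = base64.b64encode(f"{instance_id}:{api_token}".encode()).decode()

# The OTLP headers environment variable URL-encodes the space as %20.
headers_value = "Authorization=Basic%20" + credential
print(headers_value)
```

You don't need to build this value yourself; the portal provides it ready to copy. The sketch only shows what the string contains.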
Install the OpenLIT Telemetry SDK:
- AI Observability uses the OpenLIT SDK to provide auto-instrumentation for many tools in generative AI stacks, such as LLMs, vector databases, and frameworks like LangChain. OpenLIT emits OpenTelemetry traces and metrics. To install the Python SDK, run the following command:
pip install openlit
Update Your Application Code:
Add the following lines to your application:
import openlit

openlit.init()

# Rest of your application code
Set Up the OTEL Endpoint and Headers:
Run these commands in your shell to configure the endpoint and headers:
export OTEL_EXPORTER_OTLP_ENDPOINT="<YOUR_GRAFANA_OTEL_GATEWAY_URL>"
export OTEL_EXPORTER_OTLP_HEADERS="<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>"
Replace:
- <YOUR_GRAFANA_OTEL_GATEWAY_URL> with the OTEL_EXPORTER_OTLP_ENDPOINT value you copied earlier. For example: https://otlp-gateway-<ZONE>.grafana.net/otlp
- <YOUR_GRAFANA_OTEL_GATEWAY_AUTH> with the OTEL_EXPORTER_OTLP_HEADERS value. For example: Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN>
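If you prefer to configure the exporter from your application rather than the shell, the same values can be set with os.environ before the instrumentation initializes, since the OpenTelemetry exporter typically reads these variables at initialization time. A minimal sketch, keeping the same placeholders until you substitute your own values:

```python
import os

# Substitute the values you copied from the Grafana Cloud portal.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "<YOUR_GRAFANA_OTEL_GATEWAY_URL>"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>"

# Set these before calling openlit.init() in your application code,
# so the exporter picks them up when it starts.
```

Either approach works; shell exports keep credentials out of your source code, which is usually preferable.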
Monitor using the pre-built GenAI Observability dashboard
When you run your instrumented AI application, the OpenLIT SDK automatically starts sending OpenTelemetry traces and metrics about your LLM and vector database usage to Grafana Cloud.
Open the GenAI Observability dashboard you previously installed from the integration to visualize the instrumentation data.