Set up instrumentation for AI observability in Python

Follow these steps to get started quickly with Grafana Cloud AI Observability:

  1. Install the AI Observability Integration
  2. Instrument your AI Application
  3. Monitor using the pre-built GenAI Observability dashboard

Install the AI Observability Integration

To install the AI Observability integration:

  1. In your Grafana Cloud stack, click Connections in the left-side menu.

  2. Search for the name AI Observability.

  3. Click the AI Observability card and follow the instructions to instrument your application. Alternatively, follow the instructions in Instrument your AI application.

  4. Click Install dashboards to install the pre-built GenAI Observability dashboard.

Instrument your AI application

To instrument your AI application using Grafana Cloud, follow these steps:

  1. Sign in to Grafana Cloud:

    • If you don’t have an account, register for a free Grafana Cloud account.
    • Go to the Grafana Cloud Portal.
    • If you have access to multiple organizations, select one from the top-left dropdown.
    • If your organization has multiple stacks, pick a stack from the sidebar or the main stack list.
  2. Configure OpenTelemetry:

    • Click Configure under the OpenTelemetry section.
  3. Generate an API Token:

    • In the Password / API Token section, click Generate now.
    • Name the token, for example, chatbot.
    • Click Create token.
    • Click Close. You don’t need to copy the token.
    • Scroll down and copy the OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS values. Save these for later.
  4. Install the OpenLIT Telemetry SDK:

    • AI Observability uses the OpenLIT SDK to provide auto-instrumentation for many tools in generative AI stacks, such as LLMs, vector databases, and frameworks like LangChain. OpenLIT generates OpenTelemetry traces and metrics. To install the Python SDK, run the following command:
    shell
    pip install openlit
  5. Update Your Application Code:

    • Add the following lines to your application:

      python
      import openlit
      
      openlit.init()
      
      # Rest of your application code
  6. Set Up the OTEL Endpoint and Headers:

    • Run these commands in your shell to configure the endpoint and headers:

      shell
      export OTEL_EXPORTER_OTLP_ENDPOINT="<YOUR_GRAFANA_OTEL_GATEWAY_URL>"
      export OTEL_EXPORTER_OTLP_HEADERS="<YOUR_GRAFANA_OTEL_GATEWAY_AUTH>"

      Replace:

      • <YOUR_GRAFANA_OTEL_GATEWAY_URL> with the OTEL_EXPORTER_OTLP_ENDPOINT value you copied earlier. For example: https://otlp-gateway-<ZONE>.grafana.net/otlp
      • <YOUR_GRAFANA_OTEL_GATEWAY_AUTH> with the OTEL_EXPORTER_OTLP_HEADERS value. For example: Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN>
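If you ever need to reconstruct the headers value yourself, it is the Base64 encoding of `<instance ID>:<API token>`, with the space after `Basic` written as `%20` because OTLP header environment variables cannot contain a literal space. A minimal sketch, using hypothetical placeholder credentials rather than real ones:

```python
import base64

# Hypothetical placeholders; substitute your own Grafana Cloud
# instance ID and the API token you generated earlier.
instance_id = "123456"
api_token = "glc_example_token"

# HTTP Basic auth encodes "<instance ID>:<API token>" in Base64.
encoded = base64.b64encode(f"{instance_id}:{api_token}".encode()).decode()

# The space after "Basic" is percent-encoded for the env var form.
headers = f"Authorization=Basic%20{encoded}"
print(headers)
```

The resulting string is what you export as OTEL_EXPORTER_OTLP_HEADERS; in practice you can simply copy the pre-built value from the Grafana Cloud configuration page.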

Monitor using the pre-built GenAI Observability dashboard

When you run your instrumented AI application, the OpenLIT SDK automatically starts sending OpenTelemetry traces and metrics about your LLM and vector database usage to Grafana Cloud.

Open the GenAI Observability dashboard you previously installed from the integration to visualize the instrumentation data.
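Putting the steps above together, a minimal instrumented application might look like the following sketch. The endpoint and header values here are placeholders for the values you copied earlier, set programmatically instead of via shell exports; the openlit import is guarded so the snippet still runs where the SDK is not installed:

```python
import os

# Placeholder values; replace with the OTEL_EXPORTER_OTLP_ENDPOINT and
# OTEL_EXPORTER_OTLP_HEADERS copied from the Grafana Cloud portal.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = (
    "https://otlp-gateway-prod-us-central-0.grafana.net/otlp"
)
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "Authorization=Basic%20<BASE64 ENCODED INSTANCE ID AND API TOKEN>"
)

try:
    import openlit

    # Reads the OTEL_* environment variables set above and starts
    # auto-instrumenting supported LLM and vector database clients.
    openlit.init()
except ImportError:
    # openlit is not installed; the application runs without telemetry.
    pass

# Rest of your application code: calls to supported libraries
# (for example, an OpenAI client) are now traced automatically.
```

Once this runs and your application makes LLM or vector database calls, traces and metrics appear in the GenAI Observability dashboard.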