Troubleshooting and FAQ

Important: This documentation is for an older version. It applies only to the release noted; many of the features and functions have since been updated or replaced. Please view the current version.

Warning

Grafana Assistant is currently in Private Preview; some or all features are subject to change without warning. If you are interested in trying it out, please sign up by filling out this form.

Please use this form to ask a question or provide feedback about the Assistant.

What is Grafana Assistant?

Grafana Assistant is a purpose-built agentic LLM integration for Grafana. It’s designed to help you explore your data and get answers to your questions.

How does it work?

The Assistant uses agent design techniques and carefully crafted prompts to understand user intent and provide accurate responses.

The Assistant integrates with various Grafana tools and APIs to access and manipulate data. Most tools run in the browser as the user.

The Assistant uses a combination of natural language processing and machine learning to understand your questions and help you explore your data. It uses tools to interact with your data, such as querying Prometheus or Loki. The Assistant is "agentic," meaning it can perform multiple steps in a conversation, which allows it to conduct investigations.
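To make the "agentic" idea concrete, the sketch below shows the general multi-step tool-calling pattern in miniature. It is purely illustrative: the tool names, the pre-scripted plan, and the stubbed Prometheus result are hypothetical stand-ins, not Grafana Assistant's actual implementation (where an LLM, not a fixed plan, chooses each step).

```python
# Illustrative sketch of an agentic loop: the agent can invoke tools over
# multiple steps, accumulating observations, before producing an answer.
# Tool names and results here are hypothetical placeholders.

def query_prometheus(expr):
    """Hypothetical stand-in for a datasource query tool."""
    return {"expr": expr, "result": [{"metric": "up", "value": 1}]}

TOOLS = {"query_prometheus": query_prometheus}

def run_agent(plan):
    """Execute a sequence of (tool, argument) steps, collecting observations.

    A real agent would ask the LLM to pick the next step based on the
    conversation and prior observations; the fixed plan here only
    demonstrates the loop's shape.
    """
    observations = []
    for tool_name, arg in plan:
        if tool_name == "answer":          # terminal step: return the reply
            return arg, observations
        observations.append(TOOLS[tool_name](arg))
    return None, observations

answer, obs = run_agent([
    ("query_prometheus", "up"),            # step 1: gather data via a tool
    ("answer", "All targets are up."),     # step 2: respond using what was found
])
```

The key property is the loop itself: each tool result feeds back into the decision about the next step, which is what lets an agent conduct multi-step investigations rather than answer from a single prompt.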

How do I sign up for the Private Preview?

If you’re interested in trying Grafana Assistant in your own stack with your own data, please fill out this form.

A member of our team will reach out when we’re ready to onboard you.

Due to high demand, there may be some delays, but we’ll do our best to accommodate you.

Can I bring my own model?

No. Grafana's use of LLMs is more demanding than most agentic use cases, given the breadth of apps and services available across the platform. This requires careful prompt engineering as well as tool selection; changes in one part of the prompt can dramatically affect behavior.

While building Grafana Assistant, we have found prompts to be tightly coupled to individual models. A prompt that works well with one model may not perform as well with another. We have also noticed that some models are better at specific tasks than others, so we may select different models for different tasks in the background.

Swapping LLM models would be unlikely to achieve the same quality of experience and usefulness. While we're not closed off to this possibility, our priority right now is delivering a good experience for most people, directly in Grafana Cloud, using the best models for the job.

In the future, we may explore “bring your own model” as a method to run Grafana Assistant.

Will this be available in open source or Grafana Enterprise?

No. We plan to build this feature within Grafana Cloud so that, especially in the early phases of the project, we can have as much control over the user experience as possible.

For Grafana Enterprise, many customers are pinned to older versions of Grafana. The effort to properly support each of those versions, given how carefully the Assistant has to be integrated, would prevent us from providing a high-quality app for everybody. As usual, this is not "no" forever, but "no" for now.

For customers who want to bring more AI capabilities into their own stack, Grafana's official open source MCP server offers an alternative.