Grafana LLM app (experimental)
This is a Grafana application plugin which centralizes access to LLMs across Grafana.
It is responsible for:
- storing API keys for LLM providers
- proxying requests to LLMs with auth, so that other Grafana components need not store API keys
- providing Grafana Live streams of streaming responses from LLM providers (namely OpenAI)
- providing LLM based extensions to Grafana's extension points (e.g. 'explain this panel')
Future functionality will include:
- support for multiple LLM providers, including the ability to choose your own at runtime
- rate limiting of requests to LLMs, for cost control
- token and cost estimation
- RBAC to only allow certain users to use LLM functionality
Note: This plugin is experimental and may change significantly between versions, or be deprecated entirely in favor of a different approach based on user feedback.
For users
Install and configure this plugin to enable various LLM-related functionality across Grafana. This includes new functionality inside Grafana itself, such as explaining panels, and in plugins, such as natural language query editors.
All LLM requests will be routed via this plugin, which ensures that the correct API key is used and that requests are rate limited appropriately.
For plugin developers
This plugin is not designed to be interacted with directly; instead, use the convenience functions in the @grafana/experimental package, which will communicate with this plugin if it is installed.
First, add the correct version of @grafana/experimental to your dependencies in package.json:
{
  "dependencies": {
    "@grafana/experimental": "1.7.0"
  }
}
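If the package is not yet in your project, the dependency can be added from the command line instead of editing package.json by hand. The version shown is the one pinned above; the version you need may differ depending on your Grafana release.

```shell
# Add the @grafana/experimental dependency (version from the snippet above).
npm install --save @grafana/experimental@1.7.0

# Or, if your project uses yarn:
# yarn add @grafana/experimental@1.7.0
```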
Then in your components you can use the llms object from @grafana/experimental like so:
import React, { useState } from 'react';
import { useAsync } from 'react-use';
import { scan } from 'rxjs/operators';

import { llms } from '@grafana/experimental';
import { Button, Input, Spinner } from '@grafana/ui';

const MyComponent = (): JSX.Element | null => {
  const [input, setInput] = useState('');
  const [message, setMessage] = useState('');
  const [reply, setReply] = useState('');

  const { loading, error } = useAsync(async () => {
    // Check that the LLM plugin is installed and configured.
    const enabled = await llms.openai.enabled();
    if (!enabled) {
      return false;
    }
    if (message === '') {
      return;
    }
    // Stream the completions. Each element is the next stream chunk.
    const stream = llms.openai.streamChatCompletions({
      model: 'gpt-3.5-turbo',
      messages: [
        { role: 'system', content: 'You are a cynical assistant.' },
        { role: 'user', content: message },
      ],
    }).pipe(
      // Accumulate the stream chunks into a single string.
      scan((acc, delta) => acc + delta, '')
    );
    // Subscribe to the stream and update the state for each returned value.
    return stream.subscribe(setReply);
  }, [message]);

  if (error) {
    // TODO: handle errors.
    return null;
  }

  return (
    <div>
      <Input
        value={input}
        onChange={(e) => setInput(e.currentTarget.value)}
        placeholder="Enter a message"
      />
      <br />
      <Button type="submit" onClick={() => setMessage(input)}>Submit</Button>
      <br />
      <div>{loading ? <Spinner /> : reply}</div>
    </div>
  );
};
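The rxjs `scan` step above is what turns the stream of small completion deltas into a growing reply string. As a dependency-free sketch (the `accumulate` helper is illustrative, not part of any Grafana API), the same fold over an array of chunks looks like this:

```typescript
// Fold a sequence of completion deltas into the full reply, keeping each
// intermediate state -- mirroring what `scan((acc, delta) => acc + delta, '')`
// emits for each chunk arriving on the stream.
function accumulate(chunks: string[]): string[] {
  const states: string[] = [];
  let acc = '';
  for (const delta of chunks) {
    acc += delta;
    states.push(acc);
  }
  return states;
}

const states = accumulate(['Hel', 'lo', ', world']);
console.log(states[states.length - 1]); // "Hello, world"
```

Because each intermediate state is emitted, `setReply` is called once per chunk, so the UI re-renders with the partial reply as it streams in rather than waiting for the completion to finish.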
Installing LLM on Grafana Cloud:
Installing plugins on a Grafana Cloud instance is a one-click install; same with updates. Cool, right?
Note that it could take up to 1 minute to see the plugin show up in your Grafana.
For more information, visit the docs on plugin installation.
Installing on a local Grafana:
For local instances, plugins are installed and updated via a simple CLI command. Plugins are not updated automatically; however, you will be notified when updates are available right within your Grafana.
1. Install the Application
Use the grafana-cli tool to install LLM from the command line:
grafana-cli plugins install <plugin-id>
The plugin will be installed into your Grafana plugins directory; the default is /var/lib/grafana/plugins. See the Grafana CLI documentation for more information.
Alternatively, you can manually download the .zip file for your architecture and unpack it into your Grafana plugins directory.
2. Enable it
Next, log into your Grafana instance. Navigate to the Plugins section, found in your Grafana main menu.
Click the Apps tab in the Plugins section and select the newly installed app.
To enable the app, click the Config tab. Follow the instructions provided with the application and click Enable. The app and any new UI pages are now accessible from within the main menu, as designed by the app creator.
If dashboards are included with the application, Grafana will attempt to install them automatically. To view the dashboards, or to re-import or delete individual dashboards, click the Dashboards tab within the app page.
Changelog
0.2.1
- Change path handling for chat completions streams to put separate requests into separate streams. Requests can now pass a UUID as the suffix of the path; this remains backwards compatible with older versions of the frontend code.
0.2.0
- Expose a vector search API to perform semantic search against a vector database using a configurable embeddings source
0.1.0
- Support proxying LLM requests from Grafana to OpenAI