Loki is a log aggregation system designed to store and query logs from all your applications and infrastructure.
The easiest way to get started is with Grafana Cloud, our fully composable observability stack.
Why use Grafana Loki?
Loki takes a unique approach by indexing only the metadata (a small set of labels per log stream) rather than the full text of the log lines.
How does Grafana Loki work?
Pull in any logs with Promtail
Promtail is a logs collector built specifically for Loki. It uses the same service discovery as Prometheus and includes analogous features for labeling, transforming, and filtering logs before ingestion into Loki.
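As a sketch, a minimal Promtail configuration that tails local files and pushes them to a Loki instance might look like the following (the port, file paths, and label values are illustrative assumptions, not defaults you must use):

```yaml
server:
  http_listen_port: 9080            # Promtail's own HTTP port

positions:
  filename: /tmp/positions.yaml     # where Promtail records how far it has read each file

clients:
  - url: http://localhost:3100/loki/api/v1/push   # assumed local Loki push endpoint

scrape_configs:
  - job_name: system
    static_configs:
      - targets: [localhost]
        labels:
          job: varlogs                # label attached to every line from this job
          __path__: /var/log/*.log    # files to tail
```

The `labels` block is where the Prometheus-style labeling happens: every line scraped under this job is ingested into Loki carrying the `job="varlogs"` label.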
Store the logs in Loki
Loki does not index the text of logs. Instead, entries are grouped into streams and indexed with labels. Not only does this reduce costs, it also means log lines are available to query within milliseconds of being received by Loki.
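To illustrate, every log line that carries the same set of labels belongs to one stream, and only those labels go into the index; the label names and line below are hypothetical:

```
{app="api", env="prod", level="error"}     <- stream labels (indexed)
2024-01-01T12:00:00Z POST /login 500       <- log line (stored compressed, not indexed)
```

Because the index holds only label sets rather than the contents of every line, it stays small even as log volume grows.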
Use LogQL to explore
Use Loki’s powerful query language, LogQL, to explore your logs. Run LogQL queries directly within Grafana to visualize your logs alongside other data sources, or use LogCLI if you prefer a command-line experience.
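For example, LogQL supports both log queries, which filter and return matching lines, and metric queries, which compute values from logs; the `app` label here is an assumed example:

```logql
# All lines from the api app that contain the string "error"
{app="api"} |= "error"

# Per-second rate of those error lines over the last 5 minutes
rate({app="api"} |= "error" [5m])
```

The first query is what you would run in Grafana Explore to inspect raw lines; the second produces a time series you can graph or alert on.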
Alert on your logs
Set up alerting rules for Loki to evaluate against your incoming log data. Configure Loki to send the resulting alerts to a Prometheus Alertmanager so they can be routed to the right team.
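As a sketch, Loki's ruler evaluates rule files in the Prometheus-compatible format, with LogQL in the `expr` field; the threshold, duration, and label values below are illustrative assumptions:

```yaml
groups:
  - name: example-alerts
    rules:
      - alert: HighErrorRate
        # Fires when the api app logs more than 10 error lines per second,
        # sustained for 10 minutes
        expr: sum(rate({app="api"} |= "error" [5m])) > 10
        for: 10m
        labels:
          severity: warning
        annotations:
          summary: High error rate in api logs
```

Once a rule fires, the ruler forwards the alert to the Alertmanager you configure, which then applies its usual routing, grouping, and silencing.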
Built on open source, driven by the community
Choose the version that works best for you
Horizontally scalable, highly available, multi-tenant log aggregation system inspired by Prometheus.
For users who prefer to set up, administer, and maintain their own installation.
Offered as a fully managed service, Grafana Cloud Logs is a lightweight and cost-effective log aggregation system based on Grafana Loki.
Managed and administered by Grafana Labs with free and paid options for individuals, teams, and large enterprises.
Get up to 50GB of logs at no cost in the free tier of Grafana Cloud.
A self-managed logging solution that runs securely at scale with expert support from Grafana Labs.
A self-managed option for organizations that have special requirements around data localization and privacy.