
Collect logs with Grafana Agent

The Grafana Cloud stack includes a logging service powered by Grafana Loki, a Prometheus-inspired log aggregation system. This means you are not required to run your own Loki environment. You can ship logs to Grafana Cloud using Promtail or another supported client, or continue to maintain a self-hosted Loki environment if you prefer. See Collect logs with Promtail.

Prerequisites

  • A Grafana Cloud account
  • An application or system generating logs

Step 1. Install the Grafana Agent

Grafana Agent supports collecting logs and sending them to Loki using its logs subsystem. This is done using the upstream Promtail client, the official first-party log collection client created by the Loki development team. Grafana Agent is usually deployed to every machine that has log data to be monitored. For options to horizontally scale your Grafana Agent deployment, see this operation guide.

To collect logs, Grafana Agent can be installed in the following ways:

For other methods of installing Grafana Agent for collecting metrics and/or traces, see Getting started with Grafana Agent and the Quickstart guides.

Step 2. Review the Grafana Agent configuration file

The Grafana Agent configuration file contents and location will depend on the installation options discussed previously. For standalone installs on a single host in a Linux environment, the agent configuration is stored in /etc/grafana-agent.yaml by default.

For more details about managing Grafana Agent in a Kubernetes cluster that ships logs, see the Grafana Agent Logs Kubernetes Quickstart.
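For reference, a minimal logs block in /etc/grafana-agent.yaml might look like the sketch below. The endpoint URL, username, and API key are placeholders; substitute the values provided for your own Grafana Cloud stack. The scrape job shown is the varlogs example discussed later in this step.

    logs:
      configs:
        - name: default
          clients:
            # Placeholder endpoint and credentials; use your own stack's values
            - url: https://logs-prod-example.grafana.net/loki/api/v1/push
              basic_auth:
                username: 123456
                password: <your Grafana Cloud API key>
          positions:
            # Tracks how far the agent has read into each file
            filename: /tmp/positions.yaml
          scrape_configs:
            - job_name: varlogs
              static_configs:
                - targets: [localhost]
                  labels:
                    job: varlogs
                    __path__: /var/log/*log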

NOTE: Read Loki label best practices to learn how to use labels effectively for the best experience.
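In practice, this usually means sticking to a small set of static, low-cardinality labels (such as job, host, or environment) and avoiding labels derived from values like user IDs or request paths. Here is a sketch of such labels on a hypothetical scrape job; the job name, hostname, and path are placeholders:

      - job_name: myapplogs
        static_configs:
          - targets: [localhost]
            labels:
              # A few static, low-cardinality labels are usually enough
              job: myapplogs
              host: appserver-01
              env: production
              __path__: /var/log/myapp/*.log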

Some integrations will configure the Grafana Agent YAML configuration file to ship logs by default. Follow the instructions provided in the Integrations install process as needed.

If you would like to add scrape jobs for logs in other locations or with other filenames, see the examples below.

In this example, a job is added to collect any file ending in log from the /var/log/ directory. The job is added just below the scrape_configs: key and before any other job_name sections:

    scrape_configs:
    - job_name: varlogs
      static_configs:
        - targets: [localhost]
          labels:
            job: varlogs
            __path__: /var/log/*log
    - job_name: applogs   # an existing job section that the new job is placed before

Here is an example for dmesg logs:

      - job_name: dmesg
        static_configs:
          - targets: [localhost]
            labels:
              job: dmesg
              __path__: /var/log/dmesg

Here is another example that scrapes logs for a Minecraft server, with logs stored in a subdirectory of the home directory of a dedicated minecraft user.

      - job_name: minecraftlog
        static_configs:
          - targets: [localhost]
            labels:
              job: minecraft
              __path__: /home/MCuser/minecraft/logs/latest.log

NOTE: The grafana-agent user needs read access to any custom log location you intend to collect from. For example, add the grafana-agent user to the adm group, which is the group owner of /var/log/syslog (the group name might be different on your system depending on your Linux distribution and the log location), like this:

    sudo usermod -a -G adm grafana-agent
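To confirm the change, one optional check (assuming the default grafana-agent user) is to list its groups and try reading the file as that user:

    # Optional check: confirm group membership and read access
    id grafana-agent
    sudo -u grafana-agent head -n 1 /var/log/syslog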

Anytime you change the agent configuration, you must restart the agent for the new configuration to take effect.

    sudo systemctl restart grafana-agent.service

To check the status of Grafana Agent:

    sudo systemctl status grafana-agent.service
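If the agent is running but logs are not arriving, the agent's own logs can help with debugging; on systemd-based installs you can view them with journalctl:

    sudo journalctl -u grafana-agent.service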

For a troubleshooting guide for issues with Grafana Agent, see Troubleshooting the Grafana Agent.

For more details about the logs_config block in the Grafana Agent YAML configuration file, see Configure Grafana Agent.

For more examples and details about creating a Grafana Agent YAML configuration file, see Create a config file.

Step 3. Confirm logs are being ingested into Grafana Cloud

Within several minutes, logs should begin to be available in Grafana Cloud. To test this, use the Explore feature.

Click the compass Explore icon from the left sidebar menu to start. This takes you to the Explore page.

At the top of the page, use the dropdown menu to select your Loki logs data source. This should be named grafanacloud-$yourstackname-logs.

The image below shows the Log browser dropdown, which you can use to find the labels for logs being ingested into your Grafana Cloud environment.

[Image: Log browser dropdown showing ingested log labels]

If no log labels appear, logs are not being collected. If labels are listed, this confirms that logs are being received.
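As a quick check, selecting one of the labels configured in Step 2 and running a simple stream selector should return log lines. For example, assuming the varlogs job from the earlier example:

    {job="varlogs"}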

If logs are not displayed after several minutes, ensure the agent is running and double-check your configuration for typos.

In addition to the Log browser dropdown, the Explore user interface also supports autocomplete options:

[Image: Autocomplete suggestions in the Explore query editor]

Below is another example showing some of the other operators and parsers that are available. For more details about querying log data, see LogQL: Log query language.

[Image: Additional LogQL operators and parsers in the Explore query editor]

Querying logs and creating panels

Once Grafana Agent is up and running on your log source, give it some time to start collecting logs. You will then be able to query logs and create panels inside dashboards using Loki as a data source.

Logs are queried using LogQL, which works both in Explore and when creating dashboard panels.
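For example, here are two sketches that assume the varlogs job from the earlier configuration: the first returns only log lines containing the string "error", and the second counts matching lines per second over a five-minute window, which is the kind of metric query a graph panel can use.

    {job="varlogs"} |= "error"

    sum(rate({job="varlogs"} |= "error" [5m]))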

For examples and feature showcases, check out play.grafana.org for ideas and inspiration.