
Collect logs with Grafana Agent

Loki is the main server, responsible for storing logs and processing queries. Grafana Cloud includes Loki, so you don’t need to install Loki yourself; you only need to configure some settings within Grafana Cloud so that logs are aggregated and stored correctly. This log storage is what powers both visualization and querying.

Promtail is the agent, developed by the Loki team; its releases correspond to Loki releases. The Grafana Agent includes Promtail’s capabilities.

Collecting logs into Grafana Cloud is a two-step process. You must:

  • Configure Grafana Cloud
  • Install and configure a way to send logs; this page covers these options:
    • Configure the Grafana Agent to send logs from a single node
    • Configure the Grafana Agent to send logs from a Kubernetes cluster

If you want to use Promtail instead of the Grafana Agent to send logs, see Collect logs with Promtail.

Step 1. Configure Grafana Cloud to receive the logs

To begin, create a Grafana Cloud API key with the MetricsPublisher role. Save the API key, as you will need it in a later step.

Open Grafana Cloud. In the side menu, from Settings (looks like a gear) select Data Sources.

On the Configuration page that opens, in the Data Sources tab (which should already be selected), click Add data source.

From the list of options, select Loki. Keep this open in a browser tab.

Open a different browser tab and open Grafana Cloud. In this tab:

  1. In the side menu, from Onboarding (looks like a lightning bolt) select Walkthrough.
  2. Find and select Loki, scroll down and click Next: Configure service.
  3. Follow the directions in the UI to create an appropriate API key and configure your system. Click Finish configuration.

Go back to the previous tab, where you were configuring the Loki data source. Enter the Name, URL, and User you found, enter the API key you created earlier, and check the Basic Auth box.

Save and move to Step 2.

Step 2. Configure the Grafana Agent to send logs

This option covers configuring the Grafana Agent to send logs and assumes you are collecting logs from a Linux host.

There are two options listed here for configuring the Grafana Agent:

  • Install and send logs from a single node
  • Install and send logs from a Kubernetes cluster

Option A. Install the agent on a single node and send logs

The contents of this section come from the Gathering logs from a Linux host using the Grafana Agent quickstart.

Because your Linux machine is already running the agent, configuring it to send logs along with whatever metrics it is already sending is accomplished by modifying the agent configuration YAML file.

The agent configuration is stored in /etc/grafana-agent.yaml. Open the file and add this new section below the Prometheus section (if it exists) and the Integrations section (created when you installed an integration). The new section should start at the root level of indentation (all the way at the left margin of the file).
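As a rough orientation sketch (section names can vary between agent versions, so treat this as an assumption rather than a definitive layout), the top level of /etc/grafana-agent.yaml ends up shaped like this once the new section is added:

  server:
    # agent server settings
  prometheus:
    # metrics collection (if present)
  integrations:
    # created when you installed an integration
  logs:
    # new root-level section for log collection, added in this step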

Use these contents, replacing <Your Grafana.com API Key> with your newly-created API key from Step 1 and <User> with the user number you found while creating the Loki data source in Grafana Cloud. The URL in our sample is for most US-based customers; yours may differ. Use the URL you found while configuring Grafana Cloud in the previous section.

  logs:
    configs:
      - name: default
        positions:
          filename: /tmp/positions.yaml
        scrape_configs:
          - job_name: varlogs
            static_configs:
              - targets: [localhost]
                labels:
                  job: varlogs
                  __path__: /var/log/*log
        clients:
          - url: http://logs-prod-us-central1.grafana.net/loki/api/v1/push
            basic_auth:
              username: <User>
              password: <Your Grafana.com API Key>

This example will scrape and send entries from every log in /var/log whose filename ends in log. The entries are labeled with varlogs as both the job label and the job_name.

NOTE: Read Loki label best practices to learn how to use labels effectively for the best experience.

You can add additional sections for logs in other locations or with other filenames. For example, here’s one for dmesg, which you would place in the scrape_configs: section, before the clients: section.

          - job_name: dmesg
            static_configs:
              - targets: [localhost]
                labels:
                  job: dmesg
                  __path__: /var/log/dmesg

Here is another example, scraping logs for a Minecraft server whose logs are stored in a subdirectory of the /home directory of a dedicated minecraft user.

          - job_name: minecraftlog
            static_configs:
              - targets: [localhost]
                labels:
                  job: minecraft
                  __path__: /home/MCuser/minecraft/logs/latest.log
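Putting these pieces together, the assembled logs section (abbreviated here, with the per-job details elided) would be shaped roughly like this:

  logs:
    configs:
      - name: default
        positions:
          filename: /tmp/positions.yaml
        scrape_configs:
          - job_name: varlogs
            # static_configs as shown above
          - job_name: dmesg
            # static_configs as shown above
          - job_name: minecraftlog
            # static_configs as shown above
        clients:
          - url: http://logs-prod-us-central1.grafana.net/loki/api/v1/push
            # basic_auth as shown above

Each job gets its own entry under scrape_configs, while a single clients entry sends everything to Grafana Cloud.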

Anytime you change the agent configuration, you must restart the agent for the new configuration to take effect.

sudo systemctl restart grafana-agent.service

Option B. Install the agent and send logs from a Kubernetes cluster

To learn how to roll out the Grafana Agent into a Kubernetes cluster to ship logs, see the Grafana Agent Logs Kubernetes Quickstart.

Querying logs and creating panels

Once you have the agent up and running on your log source, give it some time to start collecting logs. You will then be able to query logs and create dashboard panels using Loki as a data source.

Querying logs is done using LogQL, which can be used both in Explore and when creating dashboard panels.
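For example, assuming the varlogs job label from the configuration above, queries like these (a sketch to get started, not an exhaustive reference) can be entered in Explore:

  {job="varlogs"}               # all log lines carrying the label job="varlogs"
  {job="varlogs"} |= "error"    # only lines containing the string "error"

The label selector in curly braces picks the log stream; the |= filter narrows results to matching lines.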