
GCP Logs integration for Grafana Cloud

Move your logs from GCP to Grafana Cloud using GCP Pub/Sub and the Grafana Agent

This integration includes 1 pre-built dashboard to help monitor and visualize GCP Logs metrics and logs.

Grafana Alloy configuration

Before you begin

The GCP Logs integration configures Grafana Alloy to pull logs from a GCP Pub/Sub topic that receives your logs from GCP Cloud Logging. The logs are routed from Cloud Logging to Pub/Sub via a customizable logging sink.

You will need the gcloud CLI installed, and the roles/pubsub.editor and roles/logging.configWriter IAM roles, to complete this setup.

Set up the Pub/Sub topic

The Google Pub/Sub topic acts as the queue that persists log messages, which the agent then reads.

bash
$ gcloud pubsub topics create $TOPIC_ID

For example:

bash
$ gcloud pubsub topics create cloud-logs
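To confirm the topic was created, you can describe it (this assumes the cloud-logs topic from the example above):

```bash
# Prints the topic's full resource name, for example
# projects/my-project/topics/cloud-logs
$ gcloud pubsub topics describe cloud-logs
```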

Set up the log sink

Next, create a log sink to forward logs into the Pub/Sub topic created previously:

bash
$ gcloud logging sinks create $SINK_NAME $SINK_LOCATION $OPTIONAL_FLAGS

For example:

bash
$ gcloud logging sinks create cloud-logs pubsub.googleapis.com/projects/my-project/topics/cloud-logs \
--log-filter='resource.type=("gcs_bucket")' \
--description="Cloud logs"

The command above also passes the --log-filter option, which controls which logs are routed to the destination Pub/Sub topic. For more information on log filters, refer to the Logging query language documentation.
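As an illustration, a sink filter can combine several resource types with a minimum severity (the resource types below are examples only; adjust them for your environment):

```
resource.type=("gcs_bucket" OR "cloudsql_database") AND severity>=WARNING
```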

Grant the log sink the Pub/Sub publisher role

Find the writer identity service account of the log sink you just created:

bash
gcloud logging sinks describe \
 --format='value(writerIdentity)' $SINK_NAME

For example:

bash
gcloud logging sinks describe \
 --format='value(writerIdentity)' cloud-logs

Create an IAM policy binding to allow the log sink to publish messages to the topic:

bash
gcloud pubsub topics add-iam-policy-binding $TOPIC_ID \
--member=$WRITER_IDENTITY --role=roles/pubsub.publisher

For example:

bash
gcloud pubsub topics add-iam-policy-binding cloud-logs \
--member=serviceAccount:pxxxxxxxxx-xxxxxx@gcp-sa-logging.iam.gserviceaccount.com --role=roles/pubsub.publisher

Create a Pub/Sub subscription for log delivery

Create a subscription for the Pub/Sub topic configured above; the agent uses this subscription to consume log messages.

bash
$ gcloud pubsub subscriptions create $SUBSCRIPTION_ID --topic=$TOPIC_ID \
--ack-deadline=$ACK_DEADLINE \
--message-retention-duration=$RETENTION_DURATION

For example:

bash
$ gcloud pubsub subscriptions create cloud-logs --topic=projects/my-project/topics/cloud-logs \
--ack-deadline=10 \
--message-retention-duration=7d

For more fine-grained options, refer to gcloud pubsub subscriptions create --help.

Set up using Terraform

You also have the option of creating the required resources for GCP log collection with Terraform.

How to use Terraform is outside the scope of this guide. You may find this tutorial on how to work with Terraform and GCP useful.

terraform
// Provider module
provider "google" {
  project = "$GCP_PROJECT_ID"
}

// Topic
resource "google_pubsub_topic" "main" {
  name = "cloud-logs"
}

// Log sink
variable "inclusion_filter" {
  type        = string
  description = "Optional GCP Logs query which can filter logs being routed to the pub/sub topic and promtail"
}

resource "google_logging_project_sink" "main" {
  name                   = "cloud-logs"
  destination            = "pubsub.googleapis.com/${google_pubsub_topic.main.id}"
  filter                 = var.inclusion_filter
  unique_writer_identity = true
}

resource "google_pubsub_topic_iam_binding" "log-writer" {
  topic = google_pubsub_topic.main.name
  role  = "roles/pubsub.publisher"
  members = [
    google_logging_project_sink.main.writer_identity,
  ]
}

// Subscription
resource "google_pubsub_subscription" "main" {
  name  = "cloud-logs"
  topic = google_pubsub_topic.main.name
}

Then, to create the new resources, run the snippet below after filling in the required variables.

bash
terraform apply \
    -var="inclusion_filter=<GCP Logs query of what logs to include>"
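The Terraform example above creates the subscription with default settings. To mirror the --ack-deadline and --message-retention-duration flags from the gcloud example, the subscription resource can be extended with the corresponding provider arguments (the values shown match the earlier example; adjust as needed):

```terraform
resource "google_pubsub_subscription" "main" {
  name  = "cloud-logs"
  topic = google_pubsub_topic.main.name

  // Equivalent of --ack-deadline=10 (seconds)
  ack_deadline_seconds = 10

  // Equivalent of --message-retention-duration=7d, expressed in seconds
  message_retention_duration = "604800s"
}
```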

Service account for the agent

The agent must run with a service account that has the roles/pubsub.subscriber role.

This enables the agent to read log entries from the Pub/Sub subscription created previously.
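For example, if the agent runs as a dedicated service account, the role can be granted on the subscription itself (the service account name below is hypothetical):

```bash
$ gcloud pubsub subscriptions add-iam-policy-binding cloud-logs \
--member=serviceAccount:grafana-agent@my-project.iam.gserviceaccount.com \
--role=roles/pubsub.subscriber
```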

Install GCP Logs integration for Grafana Cloud

  1. In your Grafana Cloud stack, click Connections in the left-hand menu.
  2. Find GCP Logs and click its tile to open the integration.
  3. Review the prerequisites in the Configuration Details tab and set up Grafana Agent to send GCP Logs metrics and logs to your Grafana Cloud instance.
  4. Click Install to add this integration’s pre-built dashboard to your Grafana Cloud instance, and you can start monitoring your GCP Logs setup.

Configuration snippets for Grafana Alloy

Advanced mode

The following snippets provide examples to guide you through the configuration process.

To instruct Grafana Alloy to collect your GCP logs, copy and paste the snippets into your configuration file and follow the subsequent instructions.

Advanced logs snippets

darwin

river
discovery.relabel "logs_integrations_integrations_gcp" {
	targets = []

	rule {
		source_labels = ["__gcp_logname"]
		target_label  = "logname"
	}

	rule {
		source_labels = ["__gcp_resource_type"]
		target_label  = "resource_type"
	}
}

loki.source.gcplog "logs_integrations_integrations_gcp" {
	pull {
		project_id   = "<gcp_project_id>"
		subscription = "<gcp_pubsub_subscription_name>"
		labels       = {
			job = "integrations/gcp",
		}
	}
	forward_to    = [loki.write.grafana_cloud_loki.receiver]
	relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}

To monitor your GCP Logs, you will use a combination of the following components:

  • discovery.relabel defines any relabeling needed before sending logs to Loki.
  • loki.source.gcplog retrieves logs from cloud resources such as GCS buckets, load balancers, or Kubernetes clusters running on GCP by making use of Pub/Sub subscriptions.

Replace the <gcp_project_id> and <gcp_pubsub_subscription_name> placeholders with the GCP project ID and the subscription name created in the pre-install steps.
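As an illustration of what the relabel rules above do, the following Python sketch models the mapping from a (simplified) GCP log entry to the Loki labels the snippet produces. Alloy performs this internally; this is only a model for clarity, not Alloy's implementation, and none of these function names are part of Alloy's API.

```python
# Illustrative sketch only: models how the discovery.relabel rules above
# turn GCP log entry fields into Loki labels.

def relabel(entry: dict) -> dict:
    # Internal labels that loki.source.gcplog exposes for each entry
    internal = {
        "__gcp_logname": entry.get("logName", ""),
        "__gcp_resource_type": entry.get("resource", {}).get("type", ""),
    }
    # Static label from the pull block
    labels = {"job": "integrations/gcp"}
    # rule { source_labels = ["__gcp_logname"], target_label = "logname" }
    labels["logname"] = internal["__gcp_logname"]
    # rule { source_labels = ["__gcp_resource_type"], target_label = "resource_type" }
    labels["resource_type"] = internal["__gcp_resource_type"]
    return labels

sample = {
    "logName": "projects/my-project/logs/storage",
    "resource": {"type": "gcs_bucket"},
}
print(relabel(sample))
# {'job': 'integrations/gcp', 'logname': 'projects/my-project/logs/storage', 'resource_type': 'gcs_bucket'}
```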

Advanced configuration

The loki.source.gcplog component can be modified further for more advanced use cases. Some examples of those scenarios are below.

Multiple consumers, single subscription

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}

Multiple subscriptions

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-2"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}

linux

river
discovery.relabel "logs_integrations_integrations_gcp" {
	targets = []

	rule {
		source_labels = ["__gcp_logname"]
		target_label  = "logname"
	}

	rule {
		source_labels = ["__gcp_resource_type"]
		target_label  = "resource_type"
	}
}

loki.source.gcplog "logs_integrations_integrations_gcp" {
	pull {
		project_id   = "<gcp_project_id>"
		subscription = "<gcp_pubsub_subscription_name>"
		labels       = {
			job = "integrations/gcp",
		}
	}
	forward_to    = [loki.write.grafana_cloud_loki.receiver]
	relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}

To monitor your GCP Logs, you will use a combination of the following components:

  • discovery.relabel defines any relabeling needed before sending logs to Loki.
  • loki.source.gcplog retrieves logs from cloud resources such as GCS buckets, load balancers, or Kubernetes clusters running on GCP by making use of Pub/Sub subscriptions.

Replace the <gcp_project_id> and <gcp_pubsub_subscription_name> placeholders with the GCP project ID and the subscription name created in the pre-install steps.

Advanced configuration

The loki.source.gcplog component can be modified further for more advanced use cases. Some examples of those scenarios are below.

Multiple consumers, single subscription

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}

Multiple subscriptions

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-2"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}

windows

river
discovery.relabel "logs_integrations_integrations_gcp" {
	targets = []

	rule {
		source_labels = ["__gcp_logname"]
		target_label  = "logname"
	}

	rule {
		source_labels = ["__gcp_resource_type"]
		target_label  = "resource_type"
	}
}

loki.source.gcplog "logs_integrations_integrations_gcp" {
	pull {
		project_id   = "<gcp_project_id>"
		subscription = "<gcp_pubsub_subscription_name>"
		labels       = {
			job = "integrations/gcp",
		}
	}
	forward_to    = [loki.write.grafana_cloud_loki.receiver]
	relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}

To monitor your GCP Logs, you will use a combination of the following components:

  • discovery.relabel defines any relabeling needed before sending logs to Loki.
  • loki.source.gcplog retrieves logs from cloud resources such as GCS buckets, load balancers, or Kubernetes clusters running on GCP by making use of Pub/Sub subscriptions.

Replace the <gcp_project_id> and <gcp_pubsub_subscription_name> placeholders with the GCP project ID and the subscription name created in the pre-install steps.

Advanced configuration

The loki.source.gcplog component can be modified further for more advanced use cases. Some examples of those scenarios are below.

Multiple consumers, single subscription

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}

Multiple subscriptions

river
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-2"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules    
}
loki.source.gcplog "logs_integrations_integrations_gcp_2" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules        
}
Grafana Agent configuration

Before you begin

The GCP Logs integration configures the Grafana Agent to pull logs from a GCP Pub/Sub topic that receives your logs from GCP Cloud Logging. The logs are routed from Cloud Logging to Pub/Sub via a customizable logging sink.

The prerequisite GCP setup is identical to the Grafana Alloy configuration: follow the steps above to create the Pub/Sub topic and log sink, grant the sink the Pub/Sub publisher role, and create the subscription (either with the gcloud CLI or with Terraform). Run the agent with a service account that has the roles/pubsub.subscriber role, and install the integration from the Connections page in your Grafana Cloud stack as described above.

Post-install configuration for the GCP Logs integration

In the agent configuration file, replace the <gcp_project_id> and <gcp_pubsub_subscription_name> placeholders with the GCP project ID and the subscription name created in the pre-install steps.

Advanced configuration

The integration configuration found at logs.configs.scrape_configs can be modified further for more advanced use cases. Some examples of those scenarios are below.

Multiple consumers, single subscription

yaml
- gcplog:
    labels:
      job: integrations/gcp
    project_id: project-1
    subscription: subscription-1
    subscription_type: pull
    use_incoming_timestamp: false
  job_name: integrations/gcp
  relabel_configs:
    - action: replace
      source_labels:
        - __gcp_logname
      target_label: logname
    - action: replace
      source_labels:
        - __gcp_resource_type
      target_label: resource_type
- gcplog:
    labels:
      job: integrations/gcp
    project_id: project-1
    subscription: subscription-1
    subscription_type: pull
    use_incoming_timestamp: false
  job_name: integrations/gcp-2
  relabel_configs:
    - action: replace
      source_labels:
        - __gcp_logname
      target_label: logname
    - action: replace
      source_labels:
        - __gcp_resource_type
      target_label: resource_type

Multiple subscriptions

yaml
- gcplog:
    labels:
      job: integrations/gcp
    project_id: project-1
    subscription: subscription-1
    subscription_type: pull
    use_incoming_timestamp: false
  job_name: integrations/gcp
  relabel_configs:
    - action: replace
      source_labels:
        - __gcp_logname
      target_label: logname
    - action: replace
      source_labels:
        - __gcp_resource_type
      target_label: resource_type
- gcplog:
    labels:
      job: integrations/gcp
    project_id: project-1
    subscription: subscription-2
    subscription_type: pull
    use_incoming_timestamp: false
  job_name: integrations/gcp-2
  relabel_configs:
    - action: replace
      source_labels:
        - __gcp_logname
      target_label: logname
    - action: replace
      source_labels:
        - __gcp_resource_type
      target_label: resource_type

Configuration snippets for Grafana Agent

Below logs.configs.scrape_configs, insert the following lines according to your environment.

yaml
    - gcplog: 
        subscription_type: pull
        project_id: <gcp_project_id>
        subscription: <gcp_pubsub_subscription_name>
        labels:
          job: integrations/gcp
        use_incoming_timestamp: false
      job_name: integrations/gcp
      relabel_configs:
        - action: replace
          source_labels:
            - __gcp_logname
          target_label: logname
        - action: replace
          source_labels:
            - __gcp_resource_type
          target_label: resource_type

Full example configuration for Grafana Agent

Refer to the following Grafana Agent configuration for a complete example that contains all the snippets used for the GCP Logs integration. This example also includes metrics that are sent to monitor your Grafana Agent instance.

yaml
integrations:
  prometheus_remote_write:
  - basic_auth:
      password: <your_prom_pass>
      username: <your_prom_user>
    url: <your_prom_url>
  agent:
    enabled: true
    relabel_configs:
    - action: replace
      source_labels:
      - agent_hostname
      target_label: instance
    - action: replace
      target_label: job
      replacement: "integrations/agent-check"
    metric_relabel_configs:
    - action: keep
      regex: (prometheus_target_sync_length_seconds_sum|prometheus_target_scrapes_.*|prometheus_target_interval.*|prometheus_sd_discovered_targets|agent_build.*|agent_wal_samples_appended_total|process_start_time_seconds)
      source_labels:
      - __name__
  # Add here any snippet that belongs to the `integrations` section.
  # For a correct indentation, paste snippets copied from Grafana Cloud at the beginning of the line.
logs:
  configs:
  - clients:
    - basic_auth:
        password: <your_loki_pass>
        username: <your_loki_user>
      url: <your_loki_url>
    name: integrations
    positions:
      filename: /tmp/positions.yaml
    scrape_configs:
      # Add here any snippet that belongs to the `logs.configs.scrape_configs` section.
      # For a correct indentation, paste snippets copied from Grafana Cloud at the beginning of the line.
    - gcplog: 
        subscription_type: pull
        project_id: <gcp_project_id>
        subscription: <gcp_pubsub_subscription_name>
        labels:
          job: integrations/gcp
        use_incoming_timestamp: false
      job_name: integrations/gcp
      relabel_configs:
        - action: replace
          source_labels:
            - __gcp_logname
          target_label: logname
        - action: replace
          source_labels:
            - __gcp_resource_type
          target_label: resource_type
metrics:
  configs:
  - name: integrations
    remote_write:
    - basic_auth:
        password: <your_prom_pass>
        username: <your_prom_user>
      url: <your_prom_url>
    scrape_configs:
      # Add here any snippet that belongs to the `metrics.configs.scrape_configs` section.
      # For a correct indentation, paste snippets copied from Grafana Cloud at the beginning of the line.
  global:
    scrape_interval: 60s
  wal_directory: /tmp/grafana-agent-wal

Dashboards

The GCP Logs integration installs the following dashboard in your Grafana Cloud instance to help monitor your system.

  • GCP Logs Overview


Changelog

md
# 0.0.3 - January 2024

* Add an overview dashboard
* Update suggested job name

# 0.0.2 - August 2023

* Add regex filter for logs datasource

# 0.0.1 - December 2022

* Initial Release

Cost

By connecting your GCP Logs instance to Grafana Cloud, you might incur charges. To view information on the number of active series that your Grafana Cloud account uses for metrics included in each Cloud tier, see Active series and dpm usage and Cloud tier pricing.