Configure GCP Logs

Complete the following steps to configure GCP Logs as shown in the following diagram.

Installation steps for GCP Logs

Select your platform

Select the platform from the drop-down menu.

Install Grafana Alloy

Alloy reads the logs from the Pub/Sub subscription.

  1. If you have not already installed Alloy where you intend to run GCP Logs, click Run Grafana Alloy.

  2. At the Alloy configuration screen, enter a token name. The generated token is displayed on screen and included in the command for running Alloy.

  3. Copy the command and paste it into the terminal.

  4. Click Proceed to install integration.

Set up GCP resources

Create the required resources for GCP Log collection. A customizable logging sink routes logs from Cloud Logging to a Pub/Sub topic. Grafana Alloy then pulls these logs from a Pub/Sub subscription attached to that topic.

You can configure GCP Log collection with gcloud CLI or with Terraform.

Configure with gcloud CLI

Ensure you have these prerequisites before setup (a quick check follows the list):

  • The gcloud CLI installed
  • The roles/pubsub.editor and roles/logging.configWriter IAM roles
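
If you want to confirm the CLI setup before you begin, a quick check might look like the following; my-project is a placeholder for your GCP project ID:

    # Confirm the gcloud CLI is installed and see which account is active
    gcloud --version
    gcloud auth list

    # Point gcloud at the project whose logs you want to collect
    gcloud config set project my-project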

Complete these steps in the gcloud CLI.

  1. Set up a topic in Pub/Sub. This topic acts as the queue that persists log messages so Grafana Alloy can read them.

    gcloud pubsub topics create $TOPIC_ID

    # Example
    gcloud pubsub topics create cloud-logs

  2. Set up a log sink to forward logs into the topic you created in the previous step. The --log-filter option controls which logs reach the destination topic.

    gcloud logging sinks create $SINK_NAME $SINK_DESTINATION $OPTIONAL_FLAGS

    # Example
    gcloud logging sinks create cloud-logs pubsub.googleapis.com/projects/my-project/topics/cloud-logs \
        --log-filter='resource.type=("gcs_bucket")' \
        --description="Cloud logs"

  3. Find the writer identity service account of the log sink.

    gcloud logging sinks describe --format='value(writerIdentity)' $SINK_NAME

    # Example
    gcloud logging sinks describe --format='value(writerIdentity)' cloud-logs

  4. Grant the Pub/Sub publisher role to the log sink's writer identity by creating an IAM policy binding. This allows the log sink to publish messages to the topic.

    gcloud pubsub topics add-iam-policy-binding $TOPIC_ID \
        --member=$WRITER_IDENTITY \
        --role=roles/pubsub.publisher

    # Example
    gcloud pubsub topics add-iam-policy-binding cloud-logs \
        --member=serviceAccount:pxxxxxxxxx-xxxxxx@gcp-sa-logging.iam.gserviceaccount.com \
        --role=roles/pubsub.publisher

  5. Create a Pub/Sub subscription for log delivery. Alloy uses this subscription to consume log messages. For more options, refer to gcloud pubsub subscriptions create --help.

    gcloud pubsub subscriptions create cloud-logs --topic=$TOPIC_ID \
        --ack-deadline=$ACK_DEADLINE \
        --message-retention-duration=$RETENTION_DURATION

    # Example
    gcloud pubsub subscriptions create cloud-logs --topic=projects/my-project/topics/cloud-logs \
        --ack-deadline=10 \
        --message-retention-duration=7d
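
Before wiring up Alloy, you can optionally confirm that logs are arriving in the subscription. This is a quick sanity check, assuming the cloud-logs subscription created above and that at least one log entry matching the sink filter has been generated:

    # Pull a handful of messages; --auto-ack removes them from the subscription,
    # so only use this for a one-off check
    gcloud pubsub subscriptions pull cloud-logs --limit=5 --auto-ack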

Configure with Terraform

If you need help working with Terraform and GCP, refer to this tutorial.

  1. Copy the following into a TF file.

    // Provider module
    provider "google" {
      project = "$GCP_PROJECT_ID"
    }

    // Topic
    resource "google_pubsub_topic" "main" {
      name = "cloud-logs"
    }

    // Log sink
    variable "inclusion_filter" {
      type        = string
      description = "Optional GCP Logs query that filters the logs routed to the Pub/Sub topic and Grafana Alloy"
    }

    resource "google_logging_project_sink" "main" {
      name                   = "cloud-logs"
      destination            = "pubsub.googleapis.com/${google_pubsub_topic.main.id}"
      filter                 = var.inclusion_filter
      unique_writer_identity = true
    }

    resource "google_pubsub_topic_iam_binding" "log-writer" {
      topic = google_pubsub_topic.main.name
      role  = "roles/pubsub.publisher"
      members = [
        google_logging_project_sink.main.writer_identity,
      ]
    }

    // Subscription
    resource "google_pubsub_subscription" "main" {
      name  = "cloud-logs"
      topic = google_pubsub_topic.main.name
    }
  2. Create the new resources by running the following command, replacing <GCP Logs query of what logs to include> with your inclusion filter; a worked example follows this list.

    terraform apply \
        -var="inclusion_filter=<GCP Logs query of what logs to include>"

Set up GCP service account

In the gcloud CLI, create a service account with the pubsub.subscriber role.

The service account and its role enable Grafana Alloy to read log entries from the Pub/Sub subscription.
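
The exact commands depend on your naming. One possible sketch, assuming a hypothetical service account named alloy-logs, the cloud-logs subscription created earlier, and a project ID of my-project:

    # Create the service account Alloy will authenticate as
    gcloud iam service-accounts create alloy-logs \
        --display-name="Grafana Alloy log reader"

    # Allow it to consume messages from the log subscription
    gcloud pubsub subscriptions add-iam-policy-binding cloud-logs \
        --member=serviceAccount:alloy-logs@my-project.iam.gserviceaccount.com \
        --role=roles/pubsub.subscriber

    # Create a key file for the host running Alloy to use as its Google credentials
    gcloud iam service-accounts keys create alloy-logs-key.json \
        --iam-account=alloy-logs@my-project.iam.gserviceaccount.com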

Configure Alloy to consume GCP Pub/Sub

Configure Grafana Alloy to scrape logs from GCP Pub/Sub.

  1. Navigate to the configuration file for your Alloy instance.

  2. Copy the following and append it to your Alloy configuration file.

    discovery.relabel "logs_integrations_integrations_gcp" {
        targets = []
    
        rule {
            source_labels = ["__gcp_logname"]
            target_label  = "logname"
        }
    
        rule {
            source_labels = ["__gcp_resource_type"]
            target_label  = "resource_type"
        }
    }
    
    loki.source.gcplog "logs_integrations_integrations_gcp" {
        pull {
            project_id   = "<gcp_project_id>"
            subscription = "<gcp_pubsub_subscription_name>"
            labels       = {
                job = "integrations/gcp",
            }
        }
        forward_to    = [loki.write.grafana_cloud_loki.receiver]
        relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
    }
  3. Replace the following placeholders with the values you created when you set up GCP Logs:

    • gcp_project_id: replace with the GCP project ID
    • gcp_pubsub_subscription_name: replace with the subscription name

    discovery.relabel defines any relabeling needed before sending logs to Loki. loki.source.gcplog retrieves logs from cloud resources such as GCS buckets, load balancers, or Kubernetes clusters running on GCP by using Pub/Sub subscriptions. A note on credentials follows this list.
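
In pull mode, loki.source.gcplog generally authenticates to Pub/Sub with the Google Application Default Credentials available in Alloy's environment. A minimal sketch, reusing the hypothetical key file from the service account step (the path is an assumption):

    # Make the service account key visible to the Alloy process
    export GOOGLE_APPLICATION_CREDENTIALS=/etc/alloy/alloy-logs-key.json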

Examples of advanced configuration

You can further modify the loki.source.gcplog component for more advanced use cases.

Multiple consumers, single subscription

In this example, two loki.source.gcplog components pull log messages from the same subscription, and Pub/Sub distributes the messages between them. Each component needs a unique label.
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}
Multiple subscriptions

In this example, each subscription is consumed by its own loki.source.gcplog component with a unique label.
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-2"
        labels       = {
            job = "integrations/gcp",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}
loki.source.gcplog "logs_integrations_integrations_gcp" {
    pull {
        project_id   = "project-1"
        subscription = "subscription-1"
        labels       = {
            job = "integrations/gcp2",
        }
    }
    forward_to    = [loki.write.grafana_cloud_loki.receiver]
    relabel_rules = discovery.relabel.logs_integrations_integrations_gcp.rules
}

Restart Grafana Alloy

Run the command appropriate for your platform to restart Grafana Alloy so your changes can take effect.
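
For example, on a Linux host where Alloy was installed as a systemd service (assuming the default service name alloy):

    # Restart Alloy and verify it came back up
    sudo systemctl restart alloy
    sudo systemctl status alloy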

Test connection

Click Test connection to test that Grafana Alloy is collecting data and sending it to Grafana Cloud.

View your logs

Click the Logs tab to view your logs.
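
You can also query these logs in Explore by filtering on the job label set in the Alloy configuration, for example:

    {job="integrations/gcp"}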