prometheus.scrape
prometheus.scrape configures a Prometheus scraping job for a given set of targets. The scraped metrics are forwarded to the list of receivers passed in forward_to.

Multiple prometheus.scrape components can be specified by giving them different labels.
Usage
prometheus.scrape "LABEL" {
targets = TARGET_LIST
forward_to = RECEIVER_LIST
}
Arguments
The component configures and starts a new scrape job to scrape all of the input targets. The list of arguments that can be used to configure the block is presented below.
The scrape job name defaults to the component’s unique identifier.
Any omitted fields take on their default values. If conflicting attributes are passed (for example, defining both bearer_token and bearer_token_file, or configuring both basic authentication and OAuth2 at the same time), the component reports an error.
The following arguments are supported:
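For illustration, the following sketch sets a few of the more commonly used arguments on top of the required targets and forward_to. The attribute names mirror standard Prometheus scrape settings, and the discovery.kubernetes and prometheus.remote_write references are placeholders for components assumed to be defined elsewhere.

prometheus.scrape "apps" {
  // Required: where to scrape from and where to send the samples.
  targets    = discovery.kubernetes.pods.targets
  forward_to = [prometheus.remote_write.default.receiver]

  // Commonly tuned settings; omitted fields keep their defaults.
  job_name        = "apps"
  scrape_interval = "30s"
  scrape_timeout  = "10s"
  metrics_path    = "/metrics"
  scheme          = "http"
}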
Blocks
The following blocks are supported inside the definition of prometheus.scrape:

http_client_config
http_client_config > basic_auth
http_client_config > authorization
http_client_config > oauth2
http_client_config > oauth2 > tls_config
http_client_config > tls_config
The > symbol indicates deeper levels of nesting. For example, http_client_config > basic_auth refers to a basic_auth block defined inside an http_client_config block.
http_client_config block
The http_client_config block configures settings used to connect to endpoints.

bearer_token, bearer_token_file, basic_auth, authorization, and oauth2 are mutually exclusive and only one can be provided inside an http_client_config block.
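For example, a sketch that authenticates with a bearer token read from a file might look like this; the token path is a placeholder.

prometheus.scrape "authenticated" {
  targets    = TARGET_LIST
  forward_to = RECEIVER_LIST

  http_client_config {
    // Exactly one authentication mechanism may be set; here the
    // bearer token is read from a file rather than inlined.
    bearer_token_file = "/var/run/secrets/scrape-token"
  }
}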
basic_auth block
password and password_file are mutually exclusive and only one can be provided inside a basic_auth block.
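A minimal sketch, placed inside the component's http_client_config block; the username and password file path are placeholders.

http_client_config {
  basic_auth {
    username      = "scrape-user"
    password_file = "/var/run/secrets/scrape-password"
  }
}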
authorization block
credential and credentials_file are mutually exclusive and only one can be provided inside an authorization block.
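A sketch using a credentials file, again inside http_client_config; the type attribute shown here assumes the standard Bearer scheme.

http_client_config {
  authorization {
    type             = "Bearer"  // assumed scheme; adjust for the endpoint
    credentials_file = "/var/run/secrets/bearer-token"
  }
}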
oauth2 block
client_secret and client_secret_file are mutually exclusive and only one can be provided inside an oauth2 block.

The oauth2 block may also contain its own separate tls_config sub-block.
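A sketch of a client-credentials style setup with its own tls_config sub-block; the client ID, secret file, and token URL are placeholders.

http_client_config {
  oauth2 {
    client_id          = "my-client-id"
    client_secret_file = "/var/run/secrets/oauth-client-secret"
    token_url          = "https://auth.example.com/oauth2/token"

    // TLS settings that apply to the token request itself.
    tls_config {
      insecure_skip_verify = false
    }
  }
}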
tls_config block
When min_version is not provided, the minimum acceptable TLS version is inherited from Go's default minimum version, TLS 1.2. If min_version is provided, it must be set to one of the following strings:

"TLS10" (TLS 1.0)
"TLS11" (TLS 1.1)
"TLS12" (TLS 1.2)
"TLS13" (TLS 1.3)
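A sketch that pins the minimum TLS version and points at a custom CA bundle; the CA file path is a placeholder.

http_client_config {
  tls_config {
    ca_file     = "/etc/ssl/certs/private-ca.pem"
    min_version = "TLS12"
  }
}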
Exported fields
prometheus.scrape does not export any fields that can be referenced by other components.
Component health
prometheus.scrape is only reported as unhealthy if given an invalid configuration.
Debug information
prometheus.scrape reports the status of the last scrape for each configured scrape job on the component's debug endpoint.
Debug metrics
agent_prometheus_fanout_latency (histogram): Write latency for sending to direct and indirect components.
Scraping behavior
The prometheus.scrape component borrows the scraping behavior of Prometheus. Prometheus, and by extension this component, uses a pull model for scraping metrics from a given set of targets. Each scrape target is defined as a set of key-value pairs called labels. The set of targets can either be static, or dynamically provided periodically by a service discovery component such as discovery.kubernetes. The special label __address__ must always be present and corresponds to the <host>:<port> that is used for the scrape request.
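For example, a statically defined set of targets might look like the following sketch; the hostnames and the prometheus.remote_write label are placeholders.

prometheus.scrape "static_targets" {
  // __address__ must always be present; any other labels are optional.
  targets = [
    {"__address__" = "app-1.example.com:8080", "env" = "prod"},
    {"__address__" = "app-2.example.com:8080", "env" = "dev"},
  ]
  forward_to = [prometheus.remote_write.default.receiver]
}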
By default, the scrape job tries to scrape all available targets’ /metrics
endpoints using HTTP, with a scrape interval of 1 minute and scrape timeout of
10 seconds. The metrics path, protocol scheme, scrape interval and timeout,
query parameters, as well as any other settings can be configured using the
component’s arguments.
The scrape job expects the metrics exposed by the endpoint to follow the
OpenMetrics format. All metrics are then propagated
to each receiver listed in the component’s forward_to
argument.
Labels coming from targets that start with a double underscore (__) are treated as internal and are removed prior to scraping.
The prometheus.scrape component regards a scrape as successful if it responded with an HTTP 200 OK status code and returned a body of valid metrics.
If the scrape request fails, the component’s debug UI section contains more detailed information about the failure, the last successful scrape, as well as the labels last used for scraping.
The following labels are automatically injected into the scraped time series and can help pin down a scrape target: the job label, set to the scrape job name (by default the component's unique identifier), and the instance label, set to the __address__ of the target.

Similarly, metrics that record the behavior of the scrape targets are also automatically available, such as up, scrape_duration_seconds, scrape_samples_scraped, scrape_samples_post_metric_relabeling, and scrape_series_added.
The up metric is particularly useful for monitoring and alerting on the health of a scrape job. It is set to 0 in case anything goes wrong with the scrape target, either because it is not reachable, because the connection times out while scraping, or because the samples from the target could not be processed. When the target is behaving normally, the up metric is set to 1.
Example
The following example sets up the scrape job with certain attributes (scrape endpoint, scrape interval, query parameters) and lets it scrape two instances of the blackbox exporter. The exposed metrics are sent over to the provided list of receivers, as defined by other components.
prometheus.scrape "blackbox_scraper" {
targets = [
{"__address__" = "blackbox-exporter:9115", "instance" = "one"},
{"__address__" = "blackbox-exporter:9116", "instance" = "two"},
]
forward_to = [prometheus.remote_write.grafanacloud.receiver, prometheus.remote_write.onprem.receiver]
scrape_interval = "10s"
params = { "target" = ["grafana.com"], "module" = ["http_2xx"] }
metrics_path = "/probe"
}
Here are the endpoints that are being scraped every 10 seconds:
http://blackbox-exporter:9115/probe?target=grafana.com&module=http_2xx
http://blackbox-exporter:9116/probe?target=grafana.com&module=http_2xx