Troubleshooting applications using the Java agent

If Application Observability doesn’t list your services, these are some of the most common causes:

No traffic

Make a few requests to the service to send data to Grafana Cloud.
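For example, a quick way to generate traffic, assuming the service exposes an HTTP endpoint on localhost port 8080 (adjust the URL to your service):

shell
# Hypothetical endpoint - replace with a real URL for your service
for i in 1 2 3 4 5; do curl -s -o /dev/null http://localhost:8080/; done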

Be patient

It can take up to five minutes before the data is visible in Application Observability.

Look for errors

Look for errors either on the console or in the Docker or Kubernetes logs. Don't use Application Observability logs for this: if the service isn't sending telemetry data, the errors won't show up there.

If there are errors sending telemetry data, one of the connection parameters, such as the endpoint or the credentials, is usually wrong.

A 5xx response code means that there’s something wrong with the Grafana Cloud OTLP Gateway.
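A minimal sketch for pulling the application logs, assuming the Docker container and the Kubernetes deployment are both named my-app (adjust the names to your setup):

shell
# Docker - the container name "my-app" is an assumption
docker logs my-app 2>&1 | grep -i error
# Kubernetes - the deployment name "my-app" is an assumption
kubectl logs deployment/my-app | grep -i error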

Log agent data

If there are no errors in the logs, enable logging, as described in OTLP debug logging below, to check whether the application is sending data.

If the application isn’t sending data, look for the -javaagent command line parameter to check if you loaded the Java agent.
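For reference, the agent is attached with the -javaagent flag at startup. A sketch, assuming the agent JAR is at /opt/grafana-opentelemetry-java.jar and the application is app.jar:

shell
# Paths are assumptions - adjust to your setup
java -javaagent:/opt/grafana-opentelemetry-java.jar -jar app.jar

# Check the command line of a running JVM (replace <pid>)
jcmd <pid> VM.command_line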

Log exporter data

If the application is sending data to a data collector, such as Grafana Alloy or the OpenTelemetry Collector, log the data from the collector and check that there are no errors forwarding the telemetry data to Grafana Cloud.
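For example, a sketch for checking the collector logs for export errors, assuming the collector runs as a container or deployment named otel-collector:

shell
# Docker - the container name "otel-collector" is an assumption
docker logs otel-collector 2>&1 | grep -iE "error|fail"
# Kubernetes - adjust the pod or deployment name to your setup
kubectl logs deployment/otel-collector | grep -iE "error|fail"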

Java agent logging

To inspect the internals of the Java agent, enable the Java agent’s internal debug logging:

shell
export OTEL_JAVAAGENT_DEBUG=true

Note

Only enable the Java agent debug logging when needed. The logs are extremely verbose and impact application performance.

Disable Java agent

Lastly, disable the Java agent to check that the application runs successfully without instrumentation:

shell
export OTEL_JAVAAGENT_ENABLED=false

OTLP debug logging

You can configure OTLP debug logging via environment variables. Previous versions of the Java agent use logging instead of console:

Configuration                        Result
OTEL_TRACES_EXPORTER=otlp,console    Log all spans, and also send spans to Grafana Cloud.
OTEL_METRICS_EXPORTER=otlp,console   Log all metrics, and also send metrics to Grafana Cloud.
OTEL_LOGS_EXPORTER=otlp,console      Log all logs, and also send logs to Grafana Cloud.
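For example, to log spans, metrics, and logs to the console while still sending them to Grafana Cloud:

shell
export OTEL_TRACES_EXPORTER=otlp,console
export OTEL_METRICS_EXPORTER=otlp,console
export OTEL_LOGS_EXPORTER=otlp,console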

Note

You should disable debug logging when it isn’t needed as it produces many logs.

This produces log output like the following:

Spans - look for LoggingSpanExporter:

log
INFO io.opentelemetry.exporter.logging.LoggingSpanExporter - 'WebController.withSpan' : 2337335133908c9ce6e0dfc7bda74d7c 8bfef4eaac83e8cb INTERNAL [tracer: io.opentelemetry.opentelemetry-extension-annotations-1.0:1.32.0-alpha] AttributesMap{data={thread.id=32, code.namespace=io.opentelemetry.smoketest.springboot.controller.WebController, code.function=withSpan, thread.name=http-nio-8080-exec-1}, capacity=128, totalAddedValues=4}

Metrics - look for LoggingMetricExporter:

log
INFO io.opentelemetry.exporter.logging.LoggingMetricExporter - metric: ImmutableMetricData{resource=Resource{schemaUrl=https://opentelemetry.io/schemas/1.21.0, attributes={container.id="048b9982e0b98cdc5579334bb1decc157ed1ebc23f391ebe306898898ec32fa4", host.arch="amd64", host.name="048b9982e0b9", os.description="Linux 6.2.0-39-generic", os.type="linux", process.command_line="/usr/lib/jvm/jdk-8u312-bellsoft-x86_64/jre/bin/java -javaagent:/opentelemetry-javaagent.jar -Dgrafana.otel.use-tested-instrumentations=true io.opentelemetry.smoketest.springboot.SpringbootApplication", process.executable.path="/usr/lib/jvm/jdk-8u312-bellsoft-x86_64/jre/bin/java", process.pid=1, process.runtime.description="BellSoft OpenJDK 64-Bit Server VM 25.312-b07", process.runtime.name="OpenJDK Runtime Environment", process.runtime.version="1.8.0_312-b07", service.instance.id="8231ca95-e9aa-474a-bd98-88349a9942ad", service.name="unknown_service:java", telemetry.auto.version="1.32.0", telemetry.distro.name="grafana-opentelemetry-java", telemetry.distro.version="0.32.0-beta.1", telemetry.sdk.language="java", telemetry.sdk.name="opentelemetry", telemetry.sdk.version="1.32.0"}}, instrumentationScopeInfo=InstrumentationScopeInfo{name=io.opentelemetry.runtime-telemetry-java8, version=1.32.0-alpha, schemaUrl=null, attributes={}}, name=jvm.cpu.count, description=Number of processors available to the Java virtual machine., unit={cpu}, type=LONG_SUM, data=ImmutableSumData{points=[ImmutableLongPointData{startEpochNanos=1704964347622000000, epochNanos=1704964351627000000, attributes={}, value=12, exemplars=[]}], monotonic=false, aggregationTemporality=CUMULATIVE}}

Logs - look for [scopeInfo: and the duplicated log body. The second line is included for reference, showing that it also contains HTTP request received:

log
10:12:34.031 [docker-java-stream-636643381] INFO  c.g.extensions.smoketest.SmokeTest - STDOUT: 2024-01-11T09:12:34.03Z INFO 'HTTP request received' : 2337335133908c9ce6e0dfc7bda74d7c 50d689015fd0a33c [scopeInfo: io.opentelemetry.smoketest.springboot.controller.WebController:] {thread.id=32, thread.name="http-nio-8080-exec-1"}
10:12:34.031 [docker-java-stream-636643381] INFO  c.g.extensions.smoketest.SmokeTest - STDOUT: INFO  [http-nio-8080-exec-1] io.opentelemetry.smoketest.springboot.controller.WebController: HTTP request received trace_id=