AKA ES Nginx Logs

Created by 神翔, 青空, and 睿希


1. Introduction

A Grafana dashboard for analyzing Nginx access logs stored in Elasticsearch, fed by a Filebeat → Redis → Logstash pipeline.

2. Update Logs

2020-04-23

  1. Added panels for today's PV, today's UV, and the 7-day PV trend.
  2. Tested against Kubernetes ingress-nginx logs; works as expected.
  3. Kubernetes deployment documentation is in progress.

2020-09-30

  1. Feedback reported that importing the dashboard required Prometheus; the Prometheus dependency has now been removed.

3. ELK Versions

Name            7.3.1   7.6.1   7.9.1
kibana          ok      ok      ok
filebeat        ok      ok      ok
logstash        ok      ok      ok
elasticsearch   ok      ok      ok

4. Errors

  1. Field errors
The Elasticsearch index written by Logstash must start with logstash-* (see the output section below); otherwise the Logstash configuration has to be adjusted before the dashboard works correctly.

Nginx fields

  • Make sure Nginx logs these fields; if any field names are changed, the Grafana dashboard template must be adjusted accordingly
log_format aka_logs
    '{"@timestamp":"$time_iso8601",'
    '"host":"$hostname",'
    '"server_ip":"$server_addr",'
    '"client_ip":"$remote_addr",'
    '"xff":"$http_x_forwarded_for",'
    '"domain":"$host",'
    '"url":"$uri",'
    '"referer":"$http_referer",'
    '"args":"$args",'
    '"upstreamtime":"$upstream_response_time",'
    '"responsetime":"$request_time",'
    '"request_method":"$request_method",'
    '"status":"$status",'
    '"size":"$body_bytes_sent",'
    '"request_body":"$request_body",'
    '"request_length":"$request_length",'
    '"protocol":"$server_protocol",'
    '"upstreamhost":"$upstream_addr",'
    '"file_dir":"$request_filename",'
    '"http_user_agent":"$http_user_agent"'
  '}';
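
As a usage sketch (the server name and log path below are assumptions, chosen to match the Filebeat glob in the next section), the aka_logs format is attached with an access_log directive:

# hypothetical server block; adjust server_name and the log path to your environment
server {
    listen 80;
    server_name example.com;
    # write JSON access logs in the aka_logs format defined above
    access_log /data/wwwlogs/example.com_nginx.log aka_logs;

    location / {
        root /data/wwwroot/example.com;
    }
}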

Filebeat configuration

#=========================== Filebeat inputs =============================
filebeat.inputs:
# Collect Nginx logs
- type: log
  enabled: true
  paths:
    - /data/wwwlogs/*_nginx.log
  # Enable these because the logs are JSON
  json.keys_under_root: true
  json.overwrite_keys: true
  json.add_error_key: true

#-------------------------- Redis output ------------------------------
output.redis:
  hosts: ["host"]       # Redis host to ship the logs to
  password: "password"
  key: "nginx_logs"     # Redis key under which the log entries are stored
  db: 0
  timeout: 5
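
To sanity-check that Filebeat is actually shipping events, you can inspect the Redis list directly (host and password are the placeholders from the config above):

# number of queued log entries; it should grow while Nginx is serving traffic
redis-cli -h host -a password LLEN nginx_logs
# peek at one queued entry to confirm it is the expected JSON document
redis-cli -h host -a password LINDEX nginx_logs 0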

Logstash configuration

input {
  # redis nginx key
  redis {
    data_type =>"list"
    key =>"nginx_logs"
    host =>"redis"
    port => 6379
    password => "password"
    db => 0
  }
}

filter {
  geoip {
    #multiLang => "zh-CN"
    target => "geoip"
    source => "client_ip"
    database => "/usr/share/logstash/GeoLite2-City.mmdb"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    # Drop redundant geoip fields from the output
    remove_field => ["[geoip][latitude]", "[geoip][longitude]", "[geoip][country_code]", "[geoip][country_code2]", "[geoip][country_code3]", "[geoip][timezone]", "[geoip][continent_code]", "[geoip][region_code]"]
  }
  mutate {
    convert => [ "size", "integer" ]
    convert => [ "status", "integer" ]
    convert => [ "responsetime", "float" ]
    convert => [ "upstreamtime", "float" ]
    convert => [ "[geoip][coordinates]", "float" ]
    # Drop Filebeat fields that are not needed; choose carefully, because any field removed here is no longer available in Elasticsearch for filtering
    remove_field => [ "ecs","agent","host","cloud","@version","input","logs_type" ]
  }
  # Parse http_user_agent to identify the client's browser and operating system and their versions
  useragent {
    source => "http_user_agent"
    target => "ua"
    # Drop useragent sub-fields that are not needed
    remove_field => [ "[ua][minor]","[ua][major]","[ua][build]","[ua][patch]","[ua][os_minor]","[ua][os_major]" ]
  }
}
output {
  elasticsearch {
    hosts => "es-master"
    user => "elastic"
    password => "password"
    index => "logstash-nginx-%{+YYYY.MM.dd}"
  }
}
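
Once Logstash is writing to Elasticsearch, daily indices should appear under the logstash-nginx-* pattern. A quick check (assuming the default HTTP port 9200 and the credentials configured above):

curl -u elastic:password "http://es-master:9200/_cat/indices/logstash-nginx-*?v"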

Screenshots

[Dashboard screenshot]
