At work we need to display Tomcat logs in ELK in order to analyze them, troubleshoot errors, and support other custom requirements.
Below is a sample of the catalina.out log, which we use as the basis for parsing out fields. Note that the log follows a regular pattern, which is what makes it parseable:
2018-08-08 00:00:16,728 INFO [com.staples.adv.action.BaseShopAction] -
2018-08-08 00:00:16,728 INFO [com.staples.adv.action.BaseShopAction] -
2018-08-08 00:11:10,463 WARN [org.springframework.web.servlet.PageNotFound] -
2018-08-08 00:11:10,647 WARN [org.springframework.web.servlet.PageNotFound] -
2018-08-08 00:18:13,061 ERROR [org.vancebox.exception.ExceptionResolver] -
2018-08-08 00:19:30,178 INFO [com.staples.adv.action.BaseShopAction] -
2018-08-08 00:19:30,178 INFO [com.staples.adv.action.BaseShopAction] -
After debugging on grokdebug, the log can be parsed into fields with:
%{TIMESTAMP_ISO8601:access_time} %{LOGLEVEL:loglevel} \[%{DATA:exception_info}\] - %{MESSAGE:message}
Note 1: this parses the catalina.out log into four fields: log time, log level, exception source, and exception content.
Note 2: MESSAGE is a custom pattern that matches any data, single-line or multi-line: MESSAGE [\s\S]*
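For example, the ERROR line in the sample above parses into fields along these lines (illustrative output; the message field captures whatever follows the "- " separator, which is empty in these truncated samples and would hold the exception text and stack trace in real output):

access_time    => "2018-08-08 00:18:13,061"
loglevel       => "ERROR"
exception_info => "org.vancebox.exception.ExceptionResolver"
message        => "<exception text and stack trace, if any>"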
Filebeat first reads the catalina.out log and writes it to Redis:
filebeat.inputs:
- type: log
  paths:
    - /data/sadv/10.78.3.129/test5.out
  tags: ["sadv"]
  fields:
    type: sadv
  fields_under_root: true
  # Lines that do not start with a timestamp are appended to the
  # previous event, so multi-line stack traces stay in one message.
  multiline:
    pattern: '^\d+-\d+-\d+ \d+:\d+:\d+'
    negate: true
    match: after

output.redis:
  hosts: ["10.78.1.181"]
  key: "sadv"
  datatype: list
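Before wiring up Logstash, it can be worth confirming that events are actually landing in Redis. A quick sanity check with redis-cli, using the host and key from the config above:

redis-cli -h 10.78.1.181 LLEN sadv         # number of queued events
redis-cli -h 10.78.1.181 LRANGE sadv 0 0   # peek at the first event (a JSON document)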
Logstash then reads the data from Redis, parses and filters it, and writes it to Elasticsearch:

input {
  redis {
    host => "10.78.1.181"
    port => 6379
    db => "0"
    data_type => "list"
    key => "sadv"
    type => "sadv"
  }
}

filter {
  # Drop Filebeat metadata fields that are not needed downstream.
  mutate {
    remove_field => ["@version","prospector","input","beat","source","offset"]
  }
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:access_time} %{LOGLEVEL:loglevel} \[%{DATA:exception_info}\] - %{MESSAGE:message}"
    }
    pattern_definitions => {
      "MESSAGE" => "[\s\S]*"
    }
  }
  # Use the log's own timestamp as @timestamp.
  date {
    match => [ "access_time","yyyy-MM-dd HH:mm:ss,SSS" ]
  }
  # grok appends its capture to the existing "message" field, turning it
  # into an array; index 0 is the original raw line, which we drop along
  # with access_time (already copied into @timestamp by the date filter).
  mutate {
    remove_field => ["access_time","[message][0]"]
  }
}

output {
  if [type] == "sadv" {
    if [tags][0] == "sadv" {
      elasticsearch {
        hosts => ["10.78.1.184:9200","10.78.1.185:9200","10.78.1.188:9200"]
        index => "%{type}-%{+YYYY.MM.dd}"
      }
      stdout { codec => rubydebug }
    }
  }
}

Note: the stdout output is for debugging only; it is recommended to remove it in production.
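Before starting Logstash, the pipeline file can be syntax-checked with the built-in config test (the file name sadv.conf and install path here are assumptions; use whatever location your pipeline actually lives at):

bin/logstash -f /etc/logstash/conf.d/sadv.conf --config.test_and_exit

Once events flow through, daily indices named like sadv-2018.08.08 should appear, which can be confirmed with:

curl '10.78.1.184:9200/_cat/indices/sadv-*?v'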