I. ELK Stack Overview
Elasticsearch: storage and search
Logstash: log collection
Kibana: visualization
II. ELK: Installing Logstash
Environment preparation
IP              Hostname       OS
192.168.56.11   linux-node1    CentOS 7
192.168.56.12   linux-node2    CentOS 7
Install Logstash on both servers.
1. Installing the JDK
Install the JDK:
[root@linux-node1 ~]# yum install -y java
[root@linux-node1 ~]# java -version
openjdk version "1.8.0_91"
OpenJDK Runtime Environment (build 1.8.0_91-b14)
OpenJDK 64-Bit Server VM (build 25.91-b14, mixed mode)
2. Installing Logstash
[root@linux-node1 ~]# rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch
[root@linux-node1 ~]# cat /etc/yum.repos.d/logstash.repo
[logstash-2.3]
name=Logstash repository for 2.3.x packages
baseurl=https://packages.elastic.co/logstash/2.3/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
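With the repository file in place, the package itself can be installed from it. A minimal sketch, assuming the repo above is reachable; the 2.x RPM installs under /opt/logstash, which matches the paths used in the rest of this post:
[root@linux-node1 ~]# yum install -y logstash
[root@linux-node1 ~]# /opt/logstash/bin/logstash --version    # should report a 2.3.x version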
III. Logstash Configuration
1. Basic concepts
input: standard input
output: standard output
filter: filtering
2. Output on the command line
[root@linux-node1 ~]# /opt/logstash/bin/logstash -e 'input { stdin{} } output { stdout{} }'
Settings: Default pipeline workers: 1
Pipeline main started
hehe
2016-08-20T05:58:42.683Z linux-node1 hehe
world
2016-08-20T05:58:52.057Z linux-node1 world
3. Printing output in JSON form on the command line
Add codec => rubydebug to the stdout output so that each event is displayed in a structured, JSON-like format:
[root@linux-node2 ~]# /opt/logstash/bin/logstash -e 'input { stdin{} } output { stdout{ codec => rubydebug } }'
Settings: Default pipeline workers: 1
Pipeline main started
四世同堂
{
"message" => "四世同堂",
"@version" => "1",
"@timestamp" => "2016-08-27T18:37:04.702Z",
"host" => "linux-node2"
}
4. Writing logs into Elasticsearch
Here we use the elasticsearch output plugin:
hosts is an array, so in a cluster you can list several addresses;
index is a string, and a date pattern lets you generate one index per day.
[root@linux-node2 ~]# /opt/logstash/bin/logstash -e 'input { stdin{} } output { elasticsearch { hosts => ["192.168.56.11:9200"] index => "logstash-%{+YYYY.MM.dd}" } }'
Settings: Default pipeline workers: 1
Pipeline main started
天苍苍,野茫茫
风吹草地见牛羊
On the Elasticsearch side, open http://192.168.56.11:9200/_plugin/head/ and, in the data browser, locate the current day's logstash index,
e.g. logstash-2016.08.30, and inspect its message field.
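If the head plugin is not available, the same thing can be checked from the shell. A quick sketch using the standard cat and search APIs, assuming the cluster answers on 192.168.56.11:9200:
[root@linux-node2 ~]# curl 'http://192.168.56.11:9200/_cat/indices?v'                      # a logstash-YYYY.MM.dd index should be listed
[root@linux-node2 ~]# curl 'http://192.168.56.11:9200/logstash-2016.08.30/_search?pretty'  # dump the documents indexed that day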
5. Sending logs to several destinations
Sometimes we need to write events both to standard output and to Elasticsearch at the same time:
[root@linux-node2 ~]# /opt/logstash/bin/logstash -e 'input { stdin{} } output { stdout { codec => rubydebug } elasticsearch { hosts => ["192.168.56.11:9200"] index => "logstash-%{+YYYY.MM.dd}" } }'
Settings: Default pipeline workers: 1
Pipeline main started
菊花残,满地伤
{
"message" => "菊花残,满地伤",
"@version" => "1",
"@timestamp" => "2016-08-30T15:36:22.606Z",
"host" => "linux-node2"
}
你的笑容已泛黄
{
"message" => "你的笑容已泛黄",
"@version" => "1",
"@timestamp" => "2016-08-30T15:36:31.630Z",
"host" => "linux-node2"
}
In a real production setup, every server whose logs need to be collected runs its own Logstash instance.
Sources that already ship logs over the network, such as syslog, do not need a local install; they can send events straight to a central Logstash over the network (see the sketch below).
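To illustrate the network case, here is a minimal sketch using the syslog input plugin; remote machines point their syslog daemon at this Logstash host. The port and index name are example choices, not taken from this tutorial:
input{
    syslog{
        port => 514              # ports below 1024 require root privileges
        type => "rsyslog"
    }
}
output{
    elasticsearch{
        hosts => ["192.168.56.11:9200"]
        index => "rsyslog-%{+YYYY.MM.dd}"
    }
}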
6. Writing Logstash input/output configuration files
Logstash's pipeline configuration files are kept in
/etc/logstash/conf.d
This is defined in the init script:
PATH=/sbin:/usr/sbin:/bin:/usr/bin
export PATH
if [ `id -u` -ne 0 ]; then
echo "You need root privileges to run this script"
exit 1
fi
name=logstash
pidfile="/var/run/$name.pid"
LS_USER=root
LS_GROUP=root
LS_HOME=/var/lib/logstash
LS_HEAP_SIZE="1g"
LS_LOG_DIR=/var/log/logstash
LS_LOG_FILE="${LS_LOG_DIR}/$name.log"
LS_CONF_DIR=/etc/logstash/conf.d
......
Write a demo.conf file:
[root@linux-node2 conf.d]# vim demo.conf
input{
stdin{}
}
filter{
}
output{
elasticsearch{
hosts => ["192.168.56.11:9200"]
index => "logstash-%{+YYYY.MM.dd}"
}
stdout{
codec => rubydebug
}
}
Run it from the terminal and check the result:
[root@linux-node2 conf.d]# /opt/logstash/bin/logstash -f demo.conf
Settings: Default pipeline workers: 1
Pipeline main started
谁的江山
{
"message" => "谁的江山",
"@version" => "1",
"@timestamp" => "2016-08-30T15:59:48.446Z",
"host" => "linux-node2"
}
马蹄声狂乱
{
"message" => "马蹄声狂乱",
"@version" => "1",
"@timestamp" => "2016-08-30T15:59:59.306Z",
"host" => "linux-node2"7、收取操作系统日志并写入elasticsearch
7. Collecting operating-system logs and writing them to Elasticsearch
Requirement analysis:
Write /var/log/messages and /var/log/secure into ES.
Create a custom file named file.conf:
[root@linux-node2 /etc/logstash/conf.d]# cat file.conf
input{
file{
path => ["/var/log/messages","/var/log/secure"]
type => "system-log"
start_position => "beginning" # read the files from the beginning
}
}
filter{}
output{
elasticsearch{
hosts => ["192.168.56.11:9200"]
index => "system-log-%{+YYYY.MM}"
}
}
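One caveat about the file input, as a hedged note: it remembers how far it has read in a sincedb file, so start_position => "beginning" only takes effect the first time a file is seen. For repeatable tests, the standard sincedb_path option can point at a throw-away location, e.g.:
input{
    file{
        path => ["/var/log/messages","/var/log/secure"]
        type => "system-log"
        start_position => "beginning"
        sincedb_path => "/dev/null"    # do not persist the read position; re-read on every run (testing only)
    }
}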
Start Logstash:
[root@linux-node2 conf.d]# /opt/logstash/bin/logstash -f file.conf
Settings: Default pipeline workers: 1
Pipeline main started
Note:
When started in the background (via the init script/service), Logstash loads every conf file under /etc/logstash/conf.d/.
On the Elasticsearch side, open http://192.168.56.11:9200/_plugin/head/ to browse the system-log index.
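To run it as a background service and confirm the index from the command line, a sketch (on CentOS 7 the packaged SysV init script can usually be driven through systemctl):
[root@linux-node2 ~]# systemctl start logstash                # or: /etc/init.d/logstash start
[root@linux-node2 ~]# curl 'http://192.168.56.11:9200/_cat/indices?v' | grep system-log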