The previous posts covered experiments on each ELK component individually, but never combined them into one end-to-end experiment. Here we use a simple script to generate log lines, simulating a production system that continuously produces logs.
Use a script to generate logs that simulate user activity.
The log format:
[INFO] 2019-11-30 18:00:20 [cn.success.dashboard.Main] - DAU|5206|使用优惠卷|2019-11-30 18:20:23
The payload format is "DAU" + "|" + userId + "|" + visit + "|" + date.
Filebeat reads the log file and ships its contents to Logstash, because the content needs further processing.
Logstash receives the content, processes it (for example splitting the fields), and sends the result on to Elasticsearch.
Kibana reads the data from Elasticsearch; we then build a Dashboard in Kibana for the final visualization.
The log format, charts, and Dashboard below are all custom-made.
To keep the experiment simple, the script is deliberately crude: it only prints a log line and serves no other purpose.
#!/bin/bash
# Generate one fake access-log line in the format:
#   [INFO] <date> <time> [cn.success.dashboard.Main] - DAU|<userId>|<action>|<date> <time>
visit_array=("浏览页面" "评论商品" "加入收藏" "加入购物车" "提交订单" "使用优惠卷" "领取优惠卷" "搜索" "查看订单")
# Wrap the random digit with modulo so the index always lands on a valid element (0-8);
# without it, a 9 would select a non-existent element and emit an empty action
visit_number=$(( $(head /dev/urandom | cksum | cut -c 1-1) % ${#visit_array[@]} ))
id_number=$(head /dev/urandom | cksum | cut -c 1-4)
echo "[INFO] $(date +%F) $(date +%T) [cn.success.dashboard.Main] - DAU|$id_number|${visit_array[$visit_number]}|$(date +%F) $(date +%T)"
Run it a few times to see the effect:
[root@node4 ~]# sh /opt/logs.sh
[INFO] 2019-11-30 05:35:28 [cn.success.dashboard.Main] - DAU|3149|查看订单|2019-11-30 05:35:28
[root@node4 ~]# sh /opt/logs.sh
[INFO] 2019-11-30 05:35:29 [cn.success.dashboard.Main] - DAU|2686|搜索|2019-11-30 05:35:29
[root@node4 ~]# sh /opt/logs.sh
[INFO] 2019-11-30 05:35:30 [cn.success.dashboard.Main] - DAU|2922|加入收藏|2019-11-30 05:35:30
[root@node4 ~]# sh /opt/logs.sh
[INFO] 2019-11-30 05:35:30 [cn.success.dashboard.Main] - DAU|1680|加入收藏|2019-11-30 05:35:30
[root@node4 ~]# sh /opt/logs.sh
[INFO] 2019-11-30 05:35:31 [cn.success.dashboard.Main] - DAU|1987|加入购物车|2019-11-30 05:35:31
Then run it in an infinite loop, once every two seconds, appending the output to a log file to simulate a live log:
[root@node4 ~]# while
> :
> do
>   sh /opt/logs.sh >> /var/log/elk-test.log
>   sleep 2
> done
[root@node4 ~]# tail -f /var/log/elk-test.log
[INFO] 2019-11-30 05:40:52 [cn.success.dashboard.Main] - DAU|2461|评论商品|2019-11-30 05:40:52
[INFO] 2019-11-30 05:40:54 [cn.success.dashboard.Main] - DAU|1463|加入购物车|2019-11-30 05:40:54
[INFO] 2019-11-30 05:40:56 [cn.success.dashboard.Main] - DAU|3408|加入收藏|2019-11-30 05:40:56
[INFO] 2019-11-30 05:40:58 [cn.success.dashboard.Main] - DAU|2821|加入收藏|2019-11-30 05:40:58
[INFO] 2019-11-30 05:41:00 [cn.success.dashboard.Main] - DAU|3630|加入购物车|2019-11-30 05:41:00
[INFO] 2019-11-30 05:41:02 [cn.success.dashboard.Main] - DAU|4136|加入购物车|2019-11-30 05:41:02
[INFO] 2019-11-30 05:41:05 [cn.success.dashboard.Main] - DAU|1673|加入收藏|2019-11-30 05:41:05
[INFO] 2019-11-30 05:41:07 [cn.success.dashboard.Main] - DAU|1073|评论商品|2019-11-30 05:41:07
[INFO] 2019-11-30 05:41:09 [cn.success.dashboard.Main] - DAU|3007|提交订单|2019-11-30 05:41:09
[INFO] 2019-11-30 05:41:11 [cn.success.dashboard.Main] - DAU|2361|评论商品|2019-11-30 05:41:11
[INFO] 2019-11-30 05:41:13 [cn.success.dashboard.Main] - DAU|2634|加入收藏|2019-11-30 05:41:13
[INFO] 2019-11-30 05:41:15 [cn.success.dashboard.Main] - DAU|1799|加入购物车|2019-11-30 05:41:15
[INFO] 2019-11-30 05:41:17 [cn.success.dashboard.Main] - DAU|3545|评论商品|2019-11-30 05:41:17
[INFO] 2019-11-30 05:41:19 [cn.success.dashboard.Main] - DAU|3488||2019-11-30 05:41:19
[INFO] 2019-11-30 05:41:21 [cn.success.dashboard.Main] - DAU|1711|加入购物车|2019-11-30 05:41:21
[INFO] 2019-11-30 05:41:23 [cn.success.dashboard.Main] - DAU|5896|加入购物车|2019-11-30 05:41:23
This basically achieves the effect of a live log. Next, configure Filebeat to ship it:
[root@node4 ~]# cd /usr/local/filebeat/
[root@node4 filebeat]# vi elk-test.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/elk-test.log
setup.template.settings:
  index.number_of_shards: 3
output.logstash:
  hosts: ["192.168.132.131:5044"]
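Before wiring things up, Filebeat can validate the YAML itself with its built-in `test config` subcommand, which reports whether the file parses:

```
[root@node4 filebeat]# ./filebeat test config -c elk-test.yml
Config OK
```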
Next, create the Logstash pipeline on node1:

[root@node1 logstash]# vi elk-test.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  mutate {
    split => { "message" => "|" }
  }
  mutate {
    add_field => {
      "UserId" => "%{[message][1]}"
      "visit"  => "%{[message][2]}"
      "date"   => "%{[message][3]}"
    }
  }
}
output {
  stdout { codec => rubydebug }
}
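To see why the filter reads indices 1 through 3, here is a quick shell sketch (purely illustrative, not part of the pipeline) of what splitting a sample line on `|` yields: index 0 is everything before the first pipe, index 1 the user ID, index 2 the action, index 3 the timestamp.

```shell
# Illustrative only: mimic Logstash's mutate/split on one sample line
line='[INFO] 2019-11-30 18:00:20 [cn.success.dashboard.Main] - DAU|5206|使用优惠卷|2019-11-30 18:20:23'
IFS='|' read -r prefix userid visit vdate <<< "$line"
echo "[0]=$prefix"   # everything up to "DAU"  -> [message][0]
echo "[1]=$userid"   # user ID                 -> [message][1]
echo "[2]=$visit"    # action                  -> [message][2]
echo "[3]=$vdate"    # timestamp               -> [message][3]
```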
First send the result to the console to verify the pipeline:
[root@node1 logstash]# bin/logstash -f elk-test.conf
[2019-11-30T05:58:26,028][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2019-11-30T05:58:26,864][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
Confirm that port 5044 is listening:

[root@node1 ~]# netstat -antlup | grep 5044
Then start Filebeat (-e logs to stderr, -c selects the config file):
[root@node4 filebeat]# ./filebeat -e -c elk-test.yml
Check the console output. The fields are split out, but UserId is still a string, so update the config to convert the field types (the date field is also renamed to DateTime):

[root@node1 ~]# vim /usr/local/logstash/elk-test.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  mutate {
    split => { "message" => "|" }
  }
  mutate {
    add_field => {
      "UserId"   => "%{[message][1]}"
      "visit"    => "%{[message][2]}"
      "DateTime" => "%{[message][3]}"
    }
  }
  mutate {
    convert => {
      "UserId"   => "integer"
      "visit"    => "string"
      "DateTime" => "string"
    }
  }
}
output {
  stdout { codec => rubydebug }
}
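One optional refinement, not part of the original pipeline: Logstash's date filter can parse DateTime into @timestamp, so Kibana orders events by the time recorded in the log rather than ingestion time. A sketch, assuming the timestamp format shown above:

```
filter {
  date {
    match  => ["DateTime", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}
```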
Restart Logstash and check again.
The data is now processed correctly.
Finally, switch the output from the console to Elasticsearch:

[root@node1 logstash]# vim elk-test.conf
input {
  beats {
    port => "5044"
  }
}
filter {
  mutate {
    split => { "message" => "|" }
  }
  mutate {
    add_field => {
      "UserId"   => "%{[message][1]}"
      "visit"    => "%{[message][2]}"
      "DateTime" => "%{[message][3]}"
    }
  }
  mutate {
    convert => {
      "UserId"   => "integer"
      "visit"    => "string"
      "DateTime" => "string"
    }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.132.131:9200","192.168.132.132:9200","192.168.132.133:9200"]
  }
}
Start it:
[root@node1 logstash]# bin/logstash -f elk-test.conf
View the result with elasticsearch-head.
All of the data is now being collected into Elasticsearch.
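If the head plugin is not at hand, the same check can be done with curl against any node (assuming the same hosts as in the config above); a logstash-YYYY.MM.dd index should appear:

```
curl -s 'http://192.168.132.131:9200/_cat/indices?v' | grep logstash
```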
For the detailed chart-building steps, see the earlier post on custom charts: https://www.cnblogs.com/zyxnhr/p/11954663.html
The created chart:
Save it.
Next, add a pie chart, again selecting the logstash index:
Add a filter on the left:
Add a label:
The display before and after adding it:
Add the remaining actions in the same way:
The effect once they are all added:
Configure the options:
The result:
Save it.
It now displays like this:
Also save a search in Discover, so that each action's data is shown as a table.
Create a new dashboard.
Adjust the layout a little to get the interface below.
Adjust the name.
Save.
This completes the whole experiment.
Reference: https://www.bilibili.com/video/av67957955?p=64
Original post: https://www.cnblogs.com/zyxnhr/p/11963622.html