Setting Up an ELK Environment

2018-12-01  枯木风

I. Configuration

OS: Windows 8.1
elasticsearch: 5.5.1
logstash: 2.0.0
kibana: 5.5.1

Note: Windows was chosen because this is an experimental setup; Linux is the better choice in practice.

II. Deployment Options

1. ELK + Redis
2. ELK + Kafka

Note: this walkthrough uses the first option.

III. Installation

Prerequisite: download nssm.

1. Elasticsearch
Download: download
2. logstash
Download: download
3. kibana
Download: download
Registering them as Windows services
(a) Copy the downloaded nssm.exe into the bin directory of each extracted package (Elasticsearch, logstash, and kibana). Then open a CMD prompt in that bin directory and run nssm install <service name>; for Elasticsearch, for example, run nssm install elasticsearch-service.
(b) On the Application tab, set Path to the corresponding elasticsearch.bat, logstash.bat, or kibana.bat in that bin directory.
(c) On the Details tab, set the display name for the Windows service.
(d) Finally, click Install service.

IV. Deployment

1. Create a Maven project named elk-log (any name works); its pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.suncj</groupId>
    <artifactId>elk-log</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>elk-log</name>
    <description>ELK log generator project</description>

    <dependencies>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-web</artifactId>
            <version>4.2.8.RELEASE</version>
        </dependency>

        <dependency>
            <groupId>org.eclipse.jetty.aggregate</groupId>
            <artifactId>jetty-all</artifactId>
            <version>8.1.19.v20160209</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.12</version>
        </dependency>
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-core</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>net.logstash.logback</groupId>
            <artifactId>logstash-logback-encoder</artifactId>
            <version>4.9</version>
        </dependency>
        <!--实现slf4j接口并整合 -->
        <dependency>
            <groupId>ch.qos.logback</groupId>
            <artifactId>logback-classic</artifactId>
            <version>1.2.3</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.7.0</version>
        </dependency>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <version>3.1.0</version>
        </dependency>
    </dependencies>
</project>
2. Configure logback; the logback.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">

    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <!-- Output format: %d is the date, %thread the thread name, %-5level pads the level name to a 5-character field, %msg (%m) the log message, %n a newline -->
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %c{1}.%M:%L - %m%n
            </pattern>
        </encoder>
    </appender>

    <appender name="stash"
        class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>127.0.0.1:9250</destination>
        <encoder charset="UTF-8"
            class="net.logstash.logback.encoder.LogstashEncoder" />
    </appender>

    <logger name="com.suncj" level="INFO" />

    <root level="INFO">
        <appender-ref ref="console" />
        <appender-ref ref="stash" />
    </root>

</configuration>
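The `%-5level` conversion word in the console pattern above left-justifies the level name in a 5-character field so that columns line up across INFO, WARN, and ERROR lines. A quick sketch of that padding using plain `String.format` (illustrative only; logback performs this internally, and this class is not part of the project above):

```java
public class LevelPadDemo {
    // Mimics logback's %-5level: left-justify the level name in a 5-char field.
    public static String pad(String level) {
        return String.format("%-5s", level);
    }

    public static void main(String[] args) {
        System.out.println("[" + pad("INFO") + "]");  // [INFO ]
        System.out.println("[" + pad("ERROR") + "]"); // [ERROR]
    }
}
```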
3. Set up a scheduled task in the project (to emit logs)

The scheduled task class, LogProducer:

package com.suncj.elk;

import java.util.Random;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Log generator<br>
 * Copyright (c) 2015-2016<br>
 * Created: 2017-08-05<br>
 */
public class LogProducer {
    private static final Logger log = LoggerFactory.getLogger(LogProducer.class);

    private Random rand = new Random();

    private static int logId = 0;

    public void produce() {
        log.info("log_id: {} , content:{}", logId, String.format("I am %s", logId + rand.nextInt(100000)));
        logId++;
    }

}

The application entry point:

package com.suncj.elk;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Application {
    private static Logger logger = LoggerFactory.getLogger(Application.class);

    public static ApplicationContext appContext;

    public static void main(String[] args) {
        try {
            logger.info("Loading application context");
            appContext = new ClassPathXmlApplicationContext("app-*.xml");
            logger.info("Context loaded");
        } catch (Exception e) {
            logger.error("Error in main program:", e);
        }

    }

}

Additional configuration file, app-task.xml:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:util="http://www.springframework.org/schema/util"
    xmlns:task="http://www.springframework.org/schema/task" xmlns:context="http://www.springframework.org/schema/context"
    xsi:schemaLocation="http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd
        http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd
        http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">

    <bean id="logProducer" class="com.suncj.elk.LogProducer"></bean>

    <task:scheduled-tasks>
        <task:scheduled ref="logProducer" method="produce"
            cron="0/5 * * * * *" />
    </task:scheduled-tasks>
</beans>
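The cron expression `0/5 * * * * *` fires on every second divisible by 5, so produce() runs every five seconds. A rough sketch of what the Spring scheduler does, using only the JDK (no Spring required; the helper method and the fast 50 ms demo rate are illustrative assumptions, not the original configuration):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ProducerSchedule {
    // For the cron seconds field "0/5" (fires at 0, 5, 10, ... 55):
    // seconds to wait from the current second until the next fire.
    public static int secondsUntilNextTick(int currentSecond) {
        return (5 - currentSecond % 5) % 5;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for <task:scheduled ...>: run a task at a fixed rate.
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch ticks = new CountDownLatch(3);
        scheduler.scheduleAtFixedRate(ticks::countDown, 0, 50, TimeUnit.MILLISECONDS);
        boolean done = ticks.await(2, TimeUnit.SECONDS); // wait for 3 ticks
        scheduler.shutdown();
        System.out.println(done ? "3 ticks observed" : "timed out");
    }
}
```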

4. logstash configuration
(a) Startup scripts run_redis.bat and run_es.bat, one per pipeline:
logstash.bat agent -f logstash_redis.conf
logstash.bat agent -f logstash_es.conf
(b) logstash_redis.conf
input {
    tcp {
        host => "127.0.0.1"
        port => 9250
        mode => "server"
        codec => json_lines
    }
}
output {
    redis {
        host => "127.0.0.1"
        port => 6379
        db => 1
        data_type => "list"
        key => "log:es"
    }
}
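The tcp input with `codec => json_lines` expects one JSON object per newline-terminated line, which is exactly the framing `LogstashTcpSocketAppender` writes. A minimal sketch of that framing, with a local echo server standing in for logstash on an ephemeral port (the class, port, and sample event are illustrative assumptions, not part of the setup above):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class JsonLinesDemo {
    // Send one JSON event as a newline-terminated line and read it back.
    public static String roundTrip(String jsonEvent) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) { // stand-in for logstash on 9250
            Thread client = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(jsonEvent); // the newline is the json_lines delimiter
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            });
            client.start();
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream(), StandardCharsets.UTF_8))) {
                String line = in.readLine(); // one line = one event
                client.join();
                return line;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        String event = "{\"@timestamp\":\"2018-12-01T00:00:00.000Z\",\"message\":\"log_id: 0\"}";
        System.out.println(roundTrip(event));
    }
}
```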
(c) logstash_es.conf
input {
  redis {
    data_type => "list"
    key => "log:es"
    host => "127.0.0.1"
    db => 1
    port => 6379
  }
}
output {
  stdout{
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "log-es-%{+YYYY.MM.dd}"
    flush_size => 1000
  }
}

Note: when registering logstash as a Windows service, two bat files are needed:
one ships the application logs into redis, and the other reads from redis and writes to elasticsearch. Register them as two Windows services with different service names.

References

https://kibana.logstash.es/content/kibana/index.html

http://blog.csdn.net/tulizi/article/details/52972824

http://udn.yyuap.com/doc/logstash-best-practice-cn/input/redis.html

https://www.elastic.co/guide/en/logstash/current/codec-plugins.html
