Log4j Log to STDOUT then format to JSON layout for Logstash

I have a Spring Boot application running in a Kubernetes cluster with an EFK stack (like ELK, but with Fluentd instead of Logstash, used as a lightweight alternative that collects the logs from all Kubernetes pods and sends them to Elasticsearch).

To get the logs into JSON format, I use the logstash-logback-encoder library:

<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>4.11</version>
</dependency>

Out of the box, my logs are converted to JSON (great).

I log to STDOUT, and everything gets picked up and sent to Elasticsearch. No special logging configuration is needed inside the Spring Boot application.

The problem I have now is that when I read my logs live from the Kubernetes pod's STDOUT, they are hard to read with all the JSON formatting.

Example:

{"@timestamp":"2018-02-08T12:49:06.080+01:00","@version":1,"message":"Mapped \"{[/error],produces=[text/html]}\" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse)","logger_name":"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.080+01:00","@version":1,"message":"Mapped \"{[/error]}\" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.BasicErrorController.error(javax.servlet.http.HttpServletRequest)","logger_name":"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.098+01:00","@version":1,"message":"Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.098+01:00","@version":1,"message":"Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.137+01:00","@version":1,"message":"Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.268+01:00","@version":1,"message":"Registering beans for JMX exposure on startup","logger_name":"org.springframework.jmx.export.annotation.AnnotationMBeanExporter","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.333+01:00","@version":1,"message":"Initializing ProtocolHandler [\"http-nio-8080\"]","logger_name":"org.apache.coyote.http11.Http11NioProtocol","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.355+01:00","@version":1,"message":"Starting ProtocolHandler [\"http-nio-8080\"]","logger_name":"org.apache.coyote.http11.Http11NioProtocol","thread_name":"main","level":"INFO","level_value":20000}

What I would like to do is log to STDOUT in a normal, non-JSON format, and send the logs to Fluentd in JSON format.

I tried configuring two log appenders (one to STDOUT, another in JSON format for Fluentd), but I am fairly sure that would duplicate the data (Fluentd would receive both the JSON output and the plain STDOUT output).

My plan B is to build one image for development (without JSON formatting) and another for production, but that is more like plan Z, since I want to monitor those pods in production as well.

My question is: how can I do this with a single log appender, or without duplicating data in Fluentd? Is there a different approach I have not thought of?

I assume you are using Logback rather than Log4j as tagged, since the logstash library you linked to appears to be written for Logback.

The simplest solution is probably to configure Fluentd to read the logs from a file, and redirect the JSON appender to that file.

There is an article about the tail input plugin, but basically you configure it like this:

logback.xml

<?xml version="1.0" encoding="UTF-8"?>
<configuration>

  <!-- JSON appender for log collection -->
  <appender name="json" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/some/path/to/your/file.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>/some/path/to/your/file.log.%d{yyyy-MM-dd}</fileNamePattern>
      <maxHistory>30</maxHistory>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" />
  </appender>

  <!-- Console appender for humans -->
  <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
    <!-- Set threshold for the console log here if you want the
      log collection to get all log messages regardless of level -->
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>INFO</level>
    </filter>
    <!-- encoders are assigned the type
      ch.qos.logback.classic.encoder.PatternLayoutEncoder by default -->
    <encoder>
      <pattern>%-4relative [%thread] %-5level %logger{35} - %msg %n</pattern>
    </encoder>
  </appender>

  <!-- Tie it all together -->
  <root level="all">
    <appender-ref ref="json" />
    <appender-ref ref="console" />
  </root>
</configuration>

fluentd

<source>
  @type tail
  path /some/path/to/your/file.log
  pos_file /some/path/to/your/file.log.pos
  format json
</source>

According to the documentation, Fluentd will follow the rollover: once it finishes the old file, it starts reading from the beginning of the new one.

If you want the console output to look the same as a regular Spring Boot application, you can copy the pattern from their configuration.
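As a rough sketch (this is a simplified approximation of Spring Boot's default console pattern, not the exact value; check Spring Boot's `defaults.xml` for the real one), the console encoder could look like:

```xml
<encoder>
  <!-- Approximation of Spring Boot's default console layout:
       timestamp, level, thread, abbreviated logger, message -->
  <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %5p --- [%15.15t] %-40.40logger{39} : %m%n</pattern>
</encoder>
```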

As much as I wanted to come up with a solution, in the end I simply used jq, a JSON parser, to view my logs on the CLI. I did this to avoid duplicating the log data, without having to create a file or configure Fluentd specifically to read logs from a file.
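For example (assuming `jq` is installed locally and using a placeholder pod name), you can pipe the pod's JSON logs through a `jq` filter to get one readable line per entry:

```shell
# Stream the pod's JSON logs and print a human-readable line per entry.
# "my-pod" is a placeholder; -r outputs raw strings instead of JSON.
kubectl logs -f my-pod | \
  jq -r '"\(.["@timestamp"]) \(.level) [\(.thread_name)] \(.logger_name) - \(.message)"'
```

Plain `kubectl logs my-pod | jq .` also works if you just want the JSON pretty-printed.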