Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/44199174/

Date: 2020-11-03 08:00:32 · Source: igfitidea

Log4j logging directly to elasticsearch server

Tags: java, elasticsearch, logging, log4j

Question by Daria

I'm a bit confused about how I can send my log entries directly to elasticsearch (not logstash). So far I found a few appenders (log4j.appender.SocketAppender, log4j.appender.server, etc.) that allow sending logs to a remote host, and the ConversionPattern option seems to let us convert logs into an "elastic-friendly" format, but this approach looks freaky... or am I mistaken? Is this the only way to send logs to elastic?

So far I have a such config:

log4j.rootLogger=DEBUG, server
log4j.appender.server=org.apache.log4j.net.SocketAppender
log4j.appender.server.Port=9200
log4j.appender.server.RemoteHost=localhost
log4j.appender.server.ReconnectionDelay=10000
log4j.appender.server.layout.ConversionPattern={"debug_level":"%p","debug_timestamp":"%d{ISO8601}","debug_thread":"%t","debug_file":"%F", "debug_line":"%L","debug_message":"%m"}%n

But I get an error:

log4j:WARN Detected problem with connection: java.net.SocketException: Broken pipe (Write failed)

I can't find any useful example, so I don't understand what I'm doing wrong or how to fix it. Thanks.
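A likely cause of the broken pipe above: SocketAppender writes serialized Java LoggingEvent objects over a raw TCP socket, while Elasticsearch on port 9200 speaks HTTP/JSON, so the server closes the connection. For illustration, here is a minimal sketch of producing the kind of "elastic-friendly" JSON document the ConversionPattern above describes; the class name, field set, and index name are illustrative assumptions, and no JSON escaping is done:

```java
// Sketch: build an Elasticsearch-friendly JSON document for one log event,
// mirroring the ConversionPattern fields above. Names are illustrative;
// message content is not escaped, so this is not production-ready.
public class EsLogFormatter {

    static String toJson(String level, String thread, String message) {
        // ISO8601 timestamp, like %d{ISO8601} in the pattern
        String timestamp = java.time.OffsetDateTime.now()
                .format(java.time.format.DateTimeFormatter.ISO_OFFSET_DATE_TIME);
        return String.format(
                "{\"debug_level\":\"%s\",\"debug_timestamp\":\"%s\","
              + "\"debug_thread\":\"%s\",\"debug_message\":\"%s\"}",
                level, timestamp, thread, message);
    }

    public static void main(String[] args) {
        String doc = toJson("DEBUG", Thread.currentThread().getName(), "hello elastic");
        System.out.println(doc);
        // To index such a document directly, it would be POSTed over HTTP
        // (which is what port 9200 expects), e.g. to
        // http://localhost:9200/logs/_doc -- the index name "logs" is an assumption.
    }
}
```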

Accepted answer by Daria

I found the solution that fits my requirements best: Graylog. Since it's built on top of elasticsearch, the usage is familiar, so I was able to switch to it immediately.

To use it, I added this dependency along with the basic log4j2 dependencies:

<dependency>
    <groupId>org.graylog2.log4j2</groupId>
    <artifactId>log4j2-gelf</artifactId>
    <version>1.3.2</version>
</dependency>

and used this log4j2.json configuration:

{
  "configuration": {
    "status": "info",
    "name": "LOGGER",
    "packages": "org.graylog2.log4j2",
    "appenders": {
      "GELF": {
        "name": "GELF",
        "server": "log.myapp.com",
        "port": "12201",
        "hostName": "my-awsome-app",
        "JSONLayout": {
          "compact": "false",
          "locationInfo": "true",
          "complete": "true",
          "eventEol": "true",
          "properties": "true",
          "propertiesAsList": "true"
        },
        "ThresholdFilter": {
          "level": "info"
        }
      }
    },
    "loggers": {
      "logger": [
        {
          "name": "io.netty",
          "level": "info",
          "additivity": "false",
          "AppenderRef": {
            "ref": "GELF"
          }
        }        
      ],
      "root": {
        "level": "info",
        "AppenderRef": [
          {
            "ref": "GELF"
          }
        ]
      }
    }
  }
}
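For reference, the GELF appender configured above ships each event to Graylog as a GELF JSON message. Below is a rough sketch of what such a payload looks like, following the GELF 1.1 spec; the class name, host value, and message content are illustrative, and no JSON escaping is done:

```java
// Sketch of the kind of message a GELF appender sends (GELF 1.1 JSON).
// Field names follow the GELF 1.1 spec; values here are illustrative,
// and message content is not escaped.
public class GelfMessage {

    static String build(String host, String shortMessage, int level) {
        double timestamp = System.currentTimeMillis() / 1000.0; // seconds since epoch
        return String.format(java.util.Locale.ROOT,
                "{\"version\":\"1.1\",\"host\":\"%s\","
              + "\"short_message\":\"%s\",\"timestamp\":%.3f,\"level\":%d}",
                host, shortMessage, timestamp, level);
    }

    public static void main(String[] args) {
        // level 6 = informational, per the syslog severity levels GELF uses
        System.out.println(build("app-host", "user logged in", 6));
    }
}
```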

Answer by Marcelo Grossi

I've written an appender for this: Log4J2 Elastic REST Appender, if you want to use it. It can buffer log events based on time and/or number of events before sending them to Elastic (using the _bulk API so that everything is sent in one go). It has been published to Maven Central, so it's pretty straightforward to use.
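For context, the _bulk API expects a newline-delimited (NDJSON) body: an action/metadata line followed by the document source, one pair per event, with the whole body terminated by a newline. A minimal sketch of building such a body; the class name, index name, and endpoint are assumptions for illustration:

```java
// Sketch of an Elasticsearch _bulk request body (NDJSON): an action line
// followed by the document source for each event, terminated by '\n'.
// The index name "logs" is an assumption for illustration.
import java.util.List;

public class BulkBodyBuilder {

    static String build(List<String> jsonDocs) {
        StringBuilder body = new StringBuilder();
        for (String doc : jsonDocs) {
            body.append("{\"index\":{\"_index\":\"logs\"}}\n"); // action/metadata line
            body.append(doc).append('\n');                      // source line
        }
        return body.toString(); // would be POSTed to http://localhost:9200/_bulk
    }

    public static void main(String[] args) {
        String body = build(List.of(
                "{\"level\":\"INFO\",\"message\":\"started\"}",
                "{\"level\":\"WARN\",\"message\":\"low disk\"}"));
        System.out.print(body);
    }
}
```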

Answer by bad_habit

If you'd like to check out something new, my Log4j2 Elasticsearch Appenders will give you asynchronous logging in batches, with failover.
