Get the timestamp from the log file using a Logstash .conf file

I have a .conf file and I want to take the timestamp from the log file instead of the system date. What changes should I make in the .conf file that Logstash uses to feed Elasticsearch? The logs are ingested into the ELK stack fine; the only problem is that @timestamp gets the system time instead of the time from the log file.

My log line looks like this:

11812211602170772|2019-12-19 00:00:00 004|SPP_005206|hsenid217@gmail.com|APP_016179|prov|live|IdeaMart||caas|http|94771133726||||subs-rec-charg-notify|sms|unknown|subscriber|16.02|LKR|[{"currencyCode":"LKR", "buyingRate": 1, "sellingRate": 1}]|subscription|94771133726||freeRegistration|P9545|success|success|457d|77850|dialog||percentageFromMonthlyRevenue|70||||S1000|SUCCESS|118122116020074|prorate|0|||||||||||monthly||||Mobile Account||dialog||REG_PENDING|REGISTERED

My .conf file is as follows:

input {
    file {
        path => "/home/rehan/Projects/SIEM/Splunk/test3/sdp-server/*"
        type => "translog"
        start_position => "beginning"
    }
}

filter {
    grok {
        match => { "message" => "%{NUMBER:field1}\|%{TIMESTAMP_ISO8601:field2} %{NUMBER:field3}\|%{WORD:field4}\|*\|%{WORD:field5}\@%{WORD:field6}\.com\|%{WORD:field7}\|%{WORD:field8}\|%{WORD:field9}\|%{WORD:field10}\|\|%{WORD:field11}\|%{WORD:field12}\|%{NUMBER:field13}\|\|\|\|%{WORD:field14}\-%{WORD:field15}\-%{WORD:field16}\-%{WORD:field17}\|%{WORD:field18}\|%{WORD:field19}\|%{WORD:field20}\|%{NUMBER:field21}\|%{WORD:field22}\|\[\{\"%{WORD:field23}\"\:\"%{WORD:field24}\"\,\ \"%{WORD:field25}\"\:\ %{NUMBER:field26}\, \"%{WORD:field27}\"\: %{NUMBER:field28}\}]\|%{WORD:field29}\|%{NUMBER:field30}\|\|%{WORD:field31}\|%{WORD:field32}\|%{WORD:field33}\|%{WORD:field34}\|%{WORD:field35}\|%{NUMBER:field36}\|%{WORD:field37}\|\|%{WORD:field38}\|%{NUMBER:field39}\|\|\|\|%{WORD:field40}\|%{WORD:field41}\|%{NUMBER:field42}\|%{WORD:field43}\|%{NUMBER:field43}\|\|\|\|\|\|\|\|\|\|\|%{WORD:field44}" }
    }

    date {
        match => [ "field2", "yyyy-MM-dd HH:mm:ss SSS" ]
    }
}

output {
    elasticsearch {
        hosts => "127.0.0.1:9200"
        index => "translog"
    }
}

Can anyone give an example of how I should change the .conf file?

Try the answer below. The problem is that your date filter uses the pattern "yyyy-MM-dd HH:mm:ss SSS", but your grok expression captures only "yyyy-MM-dd HH:mm:ss" into field2 (the trailing milliseconds go into field3), so the date parse fails and @timestamp falls back to the ingest time. Matching the pattern to what field2 actually contains fixes it:

input {
    file {
        path => "/home/rehan/Projects/SIEM/Splunk/test3/sdp-server/*"
        type => "translog"
        start_position => "beginning"
    }
}

filter {
    grok {
        match => { "message" => "%{NUMBER:field1}\|%{TIMESTAMP_ISO8601:field2} %{NUMBER:field3}\|%{WORD:field4}\|*\|%{WORD:field5}\@%{WORD:field6}\.com\|%{WORD:field7}\|%{WORD:field8}\|%{WORD:field9}\|%{WORD:field10}\|\|%{WORD:field11}\|%{WORD:field12}\|%{NUMBER:field13}\|\|\|\|%{WORD:field14}\-%{WORD:field15}\-%{WORD:field16}\-%{WORD:field17}\|%{WORD:field18}\|%{WORD:field19}\|%{WORD:field20}\|%{NUMBER:field21}\|%{WORD:field22}\|\[\{\"%{WORD:field23}\"\:\"%{WORD:field24}\"\,\ \"%{WORD:field25}\"\:\ %{NUMBER:field26}\, \"%{WORD:field27}\"\: %{NUMBER:field28}\}]\|%{WORD:field29}\|%{NUMBER:field30}\|\|%{WORD:field31}\|%{WORD:field32}\|%{WORD:field33}\|%{WORD:field34}\|%{WORD:field35}\|%{NUMBER:field36}\|%{WORD:field37}\|\|%{WORD:field38}\|%{NUMBER:field39}\|\|\|\|%{WORD:field40}\|%{WORD:field41}\|%{NUMBER:field42}\|%{WORD:field43}\|%{NUMBER:field43}\|\|\|\|\|\|\|\|\|\|\|%{WORD:field44}" }
    }
    date {
        match => [ "field2", "yyyy-MM-dd HH:mm:ss" ]
        target => "@timestamp"
    }
}

output {
    elasticsearch {
        hosts => "127.0.0.1:9200"
        index => "translog"
    }
}
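The key point is that the date filter's pattern must match the captured field exactly, or the parse fails silently (the event gets a _dateparsefailure tag and @timestamp keeps the ingest time). A quick Python analogy illustrates this, with strptime directives standing in for the Joda-style pattern letters and the sample string taken from the log line above:

```python
from datetime import datetime

ts = "2019-12-19 00:00:00"  # what %{TIMESTAMP_ISO8601:field2} captures

# Matching format -> parse succeeds, analogous to the corrected date filter
# pattern "yyyy-MM-dd HH:mm:ss".
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
print(parsed.isoformat())  # 2019-12-19T00:00:00

# A format expecting a trailing milliseconds component (like the original
# "yyyy-MM-dd HH:mm:ss SSS") does not match the captured value, because the
# "004" was captured separately into field3 -- the parse raises an error here,
# just as the Logstash date filter fails and falls back to the system time.
try:
    datetime.strptime(ts, "%Y-%m-%d %H:%M:%S %f")
except ValueError as exc:
    print("parse failed:", exc)
```

If you want the milliseconds as well, you could instead capture "2019-12-19 00:00:00 004" into a single field and match it with "yyyy-MM-dd HH:mm:ss SSS", but the simpler fix above is enough to stop @timestamp using the system time.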