How to view trace logs from OpenTelemetry in Elastic APM
I am receiving logs from the opentelemetry-collector in Elastic APM.
Log structure:
"{Timestamp:HH:mm:ss} {Level:u3} trace.id={TraceId} transaction.id={SpanId}{NewLine}{Message:lj }{NewLine}{Exception}"
Example:
08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG
I have tried using this pipeline:
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "parse multiple patterns",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:loglevel} \[trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})?\] %{GREEDYDATA:message}"],
          "pattern_definitions": {
            "TRACE_ID": "[0-9A-Fa-f]{32}",
            "SPAN_ID": "[0-9A-Fa-f]{16}"
          }
        },
        "date": { "field": "logtime", "target_field": "@timestamp", "formats": ["HH:mm:ss"] }
      }
    ]
  }
}
My goal is to see the logs in Elastic APM like this:
{
"@timestamp": 2021-01-05T10:10:10",
"message": "Protocol Port MIs-Match",
"trace": {
"traceId": "898a7716358b25408d4f193f1cd17831",
"spanId": "4f7590e4ba80b64b"
}
}
Good job so far. Your pipeline is almost right; however, the grok pattern needs a few fixes and you have some orphan curly braces. Here is a working example:
POST _ingest/pipeline/_simulate
{
"pipeline": {
"description": "parse multiple patterns",
"processors": [
{
"grok": {
"field": "message",
"patterns": [
"""%{TIME:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
],
"pattern_definitions": {
"TRACE_ID": "[0-9A-Fa-f]{32}",
"SPAN_ID": "[0-9A-Fa-f]{16}"
}
}
},
{
"date": {
"field": "logtime",
"target_field": "@timestamp",
"formats": [
"HH:mm:ss"
]
}
}
]
},
"docs": [
{
"_source": {
"message": "08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG"
}
}
]
}
Response:
{
"docs" : [
{
"doc" : {
"_index" : "_index",
"_type" : "_doc",
"_id" : "_id",
"_source" : {
"trace" : {
"id" : "898a7716358b25408d4f193f1cd17831"
},
"@timestamp" : "2021-01-01T08:27:47.000Z",
"loglevel" : "INF",
"message" : "SOME MSG",
"logtime" : "08:27:47",
"transaction" : {
"id" : "4f7590e4ba80b64b"
}
},
"_ingest" : {
"timestamp" : "2021-03-30T11:07:52.067275598Z"
}
}
}
]
}
Note that since the exact date is missing from the log line, the @timestamp field resolves to January 1st of the current year.
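A cleaner fix, if you control the log output template (it looks like a Serilog-style template), is to emit the full date, e.g. {Timestamp:yyyy-MM-dd HH:mm:ss}, and widen the grok and date processors accordingly. A minimal sketch of the adjusted simulate call, assuming the line then starts with a full date such as 2021-01-05 08:27:47:
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "parse log lines that carry the full date",
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            """%{TIMESTAMP_ISO8601:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
          ],
          "pattern_definitions": {
            "TRACE_ID": "[0-9A-Fa-f]{32}",
            "SPAN_ID": "[0-9A-Fa-f]{16}"
          }
        }
      },
      {
        "date": {
          "field": "logtime",
          "target_field": "@timestamp",
          "formats": [
            "yyyy-MM-dd HH:mm:ss"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2021-01-05 08:27:47 INF trace.id=898a7716358b25408d4f193f1cd17831 transaction.id=4f7590e4ba80b64b SOME MSG"
      }
    }
  ]
}
TIMESTAMP_ISO8601 matches the date-and-time prefix, so the date processor then has a real year, month and day to work with instead of defaulting to January 1st.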
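Once the simulated output looks right, the pipeline still has to be stored and attached to the index that receives the collector's logs. A minimal sketch, using a hypothetical pipeline name otel-log-correlation and target index logs-otel (substitute your own names):
PUT _ingest/pipeline/otel-log-correlation
{
  "description": "parse multiple patterns",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIME:logtime} %{WORD:loglevel} trace.id=%{TRACE_ID:trace.id}(?: transaction.id=%{SPAN_ID:transaction.id})? %{GREEDYDATA:message}"""
        ],
        "pattern_definitions": {
          "TRACE_ID": "[0-9A-Fa-f]{32}",
          "SPAN_ID": "[0-9A-Fa-f]{16}"
        }
      }
    },
    {
      "date": {
        "field": "logtime",
        "target_field": "@timestamp",
        "formats": ["HH:mm:ss"]
      }
    }
  ]
}

PUT logs-otel/_settings
{
  "index.default_pipeline": "otel-log-correlation"
}
With trace.id and transaction.id extracted into their own fields like this, the documents carry the field names Elastic APM uses for trace/log correlation, so they should show up next to the corresponding traces in the APM UI.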