How to grep multiline stack trace? Defunct threads are created with my solution
I have a file (application.log) in which my application stores its logs. Sometimes an exception occurs and I want to store only those exceptions in another file (exception.log). Each log line starts with a date, in this format:
[2017-28-09 10:00:00,000] Text of log number 1
[2017-28-09 10:00:05,000] Text of log number 2
[2017-28-09 10:00:10,000] Text of Exception number 1
at bla bla bla
at bla bla bla
at bla bla bla
at bla bla bla
[2017-28-09 10:00:15,000] Text of log number 4
In that case exception.log should contain:
[2017-28-09 10:00:10,000] Text of Exception number 1
at bla bla bla
at bla bla bla
at bla bla bla
at bla bla bla
I tried something like this:
kill $(ps -ef | grep tail | awk '{print $2}')
tail -f /path/to/my/application.log | pcregrep -M 'Exception[^\[]+' | while read line
do
(echo "$line" >> /path/to/my/exception.log)
done &
This solution does the job, but it also creates many threads, and the total number of threads on my system grows dramatically. So I need either a different solution or a way to fix the "defunct issue".
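For reference, one possible culprit is the subshell spawned around echo for every single line; a minimal, untested variant of the same pipeline that redirects its output once instead of line by line would be:

tail -f /path/to/my/application.log | pcregrep -M 'Exception[^\[]+' >> /path/to/my/exception.log &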
Do you guys know how to cut, grep or copy only the exception stack traces to another file, or how to prevent the defunct threads from being opened?
Tell grep to print one line before each match of a line that starts with a space:
grep -B1 '^ '
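Assuming the "at ..." lines in the real log are indented with a space (as Java stack traces usually are), a sketch of applying this directly to the files from the question; the GNU grep option --no-group-separator drops the "--" separators that -B otherwise prints between groups:

grep -B1 --no-group-separator '^ ' /path/to/my/application.log >> /path/to/my/exception.log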
$ cat test.txt
[2017-28-09 10:00:00,000] Text of log number 1
[2017-28-09 10:00:05,000] Text of log number 2
[2017-28-09 10:00:10,000] Text of Exception number 1
at bla bla bla
at bla bla bla
at bla bla bla
at bla bla bla
[2017-28-09 10:00:15,000] Text of log number 4
$ awk '/\[.*\]/{d=0; if($0 ~ "Exception")d=1}d' test.txt
[2017-28-09 10:00:10,000] Text of Exception number 1
at bla bla bla
at bla bla bla
at bla bla bla
at bla bla bla
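The awk program works like this: every line matching the timestamp pattern /\[.*\]/ resets the flag d to 0, and sets it back to 1 only if that same line also contains "Exception"; the trailing bare pattern d then prints every line while the flag is set, which carries the stack-trace lines along until the next timestamped log line. Applied to the files from the question (a sketch, same command as above):

awk '/\[.*\]/{d=0; if($0 ~ "Exception")d=1}d' /path/to/my/application.log >> /path/to/my/exception.log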
Below is @CWLiu's solution combined with an approach that does not use "tail -f". Simple but workable, and no zombie processes or hanging tail -f threads are left behind:
#!/bin/bash
# Create last.log if it does not exist yet
if [ ! -f /path/to/my/last.log ]; then
    echo "" > /path/to/my/last.log
fi
# Thanks @CWLiu again for the awk solution below
awk '/\[.*\]/{d=0; if($0 ~ "Exception")d=1}d' /path/to/my/application.log > /path/to/my/now.log
# Build the differential - now.log holds all current exceptions, last.log holds the exceptions seen at the previous check
diff /path/to/my/now.log /path/to/my/last.log > /path/to/my/tmp.log
# Lines present only in now.log are new exceptions; append them to exception.log
diff /path/to/my/last.log /path/to/my/now.log | grep ">" | tr -d '>' >> /path/to/my/exception.log
# Remember the current exceptions so the next run only picks up new ones
cp /path/to/my/now.log /path/to/my/last.log
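As a quick illustration of the diff | grep | tr step (hypothetical file contents): diff prefixes lines that appear only in the second file with ">", so keeping those lines and deleting the ">" character leaves exactly the exceptions that showed up since the previous check:

$ cat /path/to/my/last.log
[2017-28-09 10:00:10,000] Text of Exception number 1
$ cat /path/to/my/now.log
[2017-28-09 10:00:10,000] Text of Exception number 1
[2017-28-09 10:00:15,000] Text of Exception number 2
$ diff /path/to/my/last.log /path/to/my/now.log | grep ">" | tr -d '>'
 [2017-28-09 10:00:15,000] Text of Exception number 2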
Now I can add execution of this script to crontab to check, for example every minute, whether an exception has occurred:
* * * * * /path/to/my/file.sh
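This assumes the script has been made executable once beforehand, e.g.:

chmod +x /path/to/my/file.sh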