Python Persist Log File Stream with PyInotify
I'm having trouble persisting a log file write stream through pyinotify and its threads. I'm using pyinotify to watch a directory for CLOSE_WRITE file events. Before initializing pyinotify, I create a log stream with the built-in logging module, like so:
import os, logging
from logging import handlers
from logging.config import dictConfig

log_dir = './var/log'
name    = 'com.sadmicrowave.tesseract'

LOG_SETTINGS = { 'version' : 1
                ,'handlers': { 'core': {
                        # make the handler a rotating file handler so the file automatically gets archived and a new one gets created, preventing files from becoming so large they are unmaintainable
                        'class'        : 'logging.handlers.RotatingFileHandler'
                        # by setting the handler to the DEBUG level (lowest level) we will include all other levels by default
                        ,'level'       : 'DEBUG'
                        # this references the 'core' formatter located in the 'formatters' dict element below
                        ,'formatter'   : 'core'
                        # the path and file name of the output log file
                        ,'filename'    : os.path.join(log_dir, "%s.log" % name)
                        ,'mode'        : 'a'
                        # the max size we want the log file to reach before it gets archived and a new file gets created
                        ,'maxBytes'    : 100000
                        # the max number of files we want to keep in the archive
                        ,'backupCount' : 5 }
                    }
                # create the formatters which are referenced in the handlers section above
                ,'formatters': { 'core': { 'format': '%(levelname)s %(asctime)s %(module)s|%(funcName)s %(lineno)d: %(message)s' }
                    }
                ,'loggers'   : { 'root': {
                        'level'     : 'DEBUG'   # the most granular level of logging available in the logging module
                        ,'handlers' : ['core']
                    }
                }
            }

# use the built-in dict configuration tool to convert the dict to a logger config
dictConfig(LOG_SETTINGS)

# get the logger created in the config and named 'root' in the 'loggers' section of the config
__log = logging.getLogger('root')
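For reference, a quick way to confirm the configuration actually writes to disk before pyinotify gets involved (a minimal sketch; it assumes the ./var/log directory already exists relative to the working directory):

# minimal sanity check (assumption: ./var/log already exists)
__log.info('logger initialized')                      # should append one formatted INFO line

with open(os.path.join(log_dir, "%s.log" % name)) as fh:
    print(fh.read())                                  # the INFO line above should show up here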
So once my __log variable is initialized, it works right away and writes to the log succeed. Next I want to start the pyinotify instance and pass __log into it, using the following class definitions:
import asyncore, pyinotify

class Notify (object):
    def __init__ (self, log=None, verbose=True):
        wm = pyinotify.WatchManager()
        wm.add_watch( '/path/to/folder/to/monitor/', pyinotify.IN_CLOSE_WRITE, proc_fun=processEvent(log, verbose) )

        notifier = pyinotify.AsyncNotifier(wm, None)
        asyncore.loop()

class processEvent (pyinotify.ProcessEvent):
    def __init__ (self, log=None, verbose=True):
        log.info('logging some cool stuff')
        self.__log     = log
        self.__verbose = verbose

    def process_IN_CLOSE_WRITE (self, event):
        print event
In the implementation above, my process_IN_CLOSE_WRITE method fires exactly as expected from the pyinotify.AsyncNotifier; however, the 'logging some cool stuff' line never gets written to the log file. I feel like it has something to do with persisting the file stream through the pyinotify thread process; however, I'm not sure how to fix it. Any ideas?
I may have found a solution that seems to work. I'm not sure it's the best approach, so I'll leave the question open for now to see if any other ideas get posted. I think I was mishandling the pyinotify.AsyncNotifier setup. I changed the implementation to:
class Notify (object):
    def __init__ (self, log=None, verbose=True):
        notifiers   = []
        descriptors = []

        wm = pyinotify.WatchManager()
        notifiers.append  ( pyinotify.AsyncNotifier(wm, processEvent(log, verbose)) )
        descriptors.append( wm.add_watch( '/path/to/folder/to/monitor/', pyinotify.IN_CLOSE_WRITE, proc_fun=processEvent(log, verbose), auto_add=True ) )

        asyncore.loop()
Now my wrapper class processEvent fires when the listener is instantiated, and when a CLOSE_WRITE event is triggered the log object is maintained and passed along correctly and can receive the write events.
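For completeness, here is a minimal sketch of how the same pieces could be wired together; it is not the code from the original post. The watch path is a placeholder, and it uses pyinotify's my_init() hook (which ProcessEvent.__init__ calls with any extra keyword arguments) instead of overriding __init__, so the base-class setup isn't bypassed:

import asyncore, logging, pyinotify

class ProcessClose (pyinotify.ProcessEvent):
    # my_init() is invoked by pyinotify.ProcessEvent.__init__ with the extra
    # keyword arguments, so the parent constructor still runs normally
    def my_init (self, log=None, verbose=True):
        self._log     = log
        self._verbose = verbose

    def process_IN_CLOSE_WRITE (self, event):
        # the handler keeps a reference to the same logger configured above
        self._log.info('closed after write: %s' % event.pathname)

def watch (path, log):
    wm      = pyinotify.WatchManager()
    handler = ProcessClose(log=log, verbose=True)                 # one handler instance, shared by all events
    pyinotify.AsyncNotifier(wm, handler)                          # asyncore-based notifier using that handler
    wm.add_watch(path, pyinotify.IN_CLOSE_WRITE, auto_add=True)
    asyncore.loop()                                               # blocks here and dispatches events

# hypothetical wiring; '/path/to/folder/to/monitor/' is a placeholder
# watch('/path/to/folder/to/monitor/', logging.getLogger('root'))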