Django logging to multiple files at once even with RotatingFileHandler?
I'm using Django with the following logging configuration:
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '{asctime} {process:d} {thread:d} {levelname} {name} {module} {funcName} {message}',
            'style': '{',
        }
    },
    'handlers': {
        'file': {
            'level': 'INFO',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': BASE_DIR + '/logs/django_logs.log',
            'backupCount': 14,
            'maxBytes': 52428800,
            'formatter': 'verbose',
        }
    },
    'loggers': {
        '': {
            'handlers': ['file'],
            'level': 'INFO',
        }
    },
}
I'm running 16 Django processes: daphne for websockets and a few gunicorn processes for the normal API calls built on Django REST framework. But when I look at the logs, multiple files are being written to at the same time; for example, django_logs.log.1 ... django_logs.log.14 are all receiving log lines simultaneously. Do I have to add something else so that only one file is written at a time, and it rotates only when its size exceeds the specified maxBytes?
For extra information: I'm on Python 3.6.8, and I initialize the logger in each project file as follows:
import logging
logger = logging.getLogger(__name__)
From the Python Logging Cookbook:
Although logging is thread-safe, and logging to a single file from multiple threads in a single process is supported, logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)
A complete example of this approach is also covered in detail in the cookbook. See this section.
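As a rough sketch of the Django side of that setup (the receiving socket server itself is the part shown in the cookbook; the localhost host and port 9020 here are assumptions, 9020 being logging.handlers.DEFAULT_TCP_LOGGING_PORT):

# Hypothetical Django-side config: every process sends pickled log records
# over TCP instead of writing the file itself.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'socket': {
            'level': 'INFO',
            'class': 'logging.handlers.SocketHandler',
            'host': 'localhost',
            'port': 9020,
        },
    },
    'loggers': {
        '': {
            'handlers': ['socket'],
            'level': 'INFO',
        },
    },
}
# A separate listener process (the cookbook's socket server) reads the records
# and is the single writer to the RotatingFileHandler. Formatting happens on
# that side, since SocketHandler sends the raw record rather than a string.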
The docs also suggest an alternative to this approach:
Alternatively, you can use a Queue and a QueueHandler to send all logging events to one of the processes in your multi-process application.
I've tested your logging configuration, and when a single process logs to the file handler it works as expected: the file only rotates once maxBytes is reached.