How to stop Kafka DEBUG logs when Kafka lib is used as Keycloak module
A Keycloak module (including a Kafka producer) is deployed to a "keycloak-4.8.1.Final" server.

My problem:
Even though I use the INFO log level, DEBUG messages are still logged from the Kafka library. I want to stop these noisy Kafka debug logs from cluttering the Keycloak log file. Can anyone help me solve this issue?

I suspect a conflict between Keycloak's logging (jboss-logging) and Kafka's own (slf4j).
Sample logs:
10:10:40,642 INFO [stdout] (kafka-producer-network-thread | InternalUserProvisioningProducer) 47473973 [kafka-producer-network-thread | InternalUserProvisioningProducer] DEBUG org.apache.kafka.clients.NetworkClient - [Producer clientId=InternalUserProvisioningProducer] Sending metadata request (type=MetadataRequest, topics=) to node localhost:9092 (id: 0 rack: null)
10:10:40,644 INFO [stdout] (kafka-producer-network-thread | InternalUserProvisioningProducer) 47473975 [kafka-producer-network-thread | InternalUserProvisioningProducer] DEBUG org.apache.kafka.clients.Metadata - Updated cluster metadata version 28 to Cluster(id = 5N8ICZgiS-GewacYHMDtlg, nodes = [localhost:9092 (id: 0 rack: null)], partitions = [])
Kafka library (in pom.xml):
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.12</artifactId>
    <version>1.0.1</version>
</dependency>
Keycloak module configuration (module.xml):
<module xmlns="urn:jboss:module:1.5" name="com.my.core.internal-user-authenticator-module">
    <properties>
        <property name="jboss.api" value="private"/>
    </properties>
    <resources>
        <resource-root path="internal-user-authenticator-module-0.0.1-SNAPSHOT.jar"/>
    </resources>
    <dependencies>
        <module name="org.keycloak.keycloak-core"/>
        <module name="org.keycloak.keycloak-common"/>
        <module name="org.keycloak.keycloak-services"/>
        <module name="org.keycloak.keycloak-server-spi"/>
        <module name="org.keycloak.keycloak-server-spi-private"/>
        <module name="javax.api"/>
        <module name="javax.ws.rs.api"/>
        <module name="javax.persistence.api"/>
        <module name="org.jboss.resteasy.resteasy-jaxrs"/>
        <module name="com.sun.xml.bind"/>
        <module name="javax.xml.bind.api"/>
        <module name="org.jboss.resteasy.resteasy-jaxb-provider"/>
        <module name="org.wildfly.security.elytron"/>
        <module name="org.bouncycastle"/>
        <module name="com.fasterxml.jackson.core.jackson-core" export="true"/>
        <module name="com.fasterxml.jackson.core.jackson-databind" export="true"/>
    </dependencies>
</module>
Keycloak logging configuration (in standalone.xml):
<subsystem xmlns="urn:jboss:domain:logging:6.0">
    <console-handler name="CONSOLE">
        <level name="INFO"/>
        <formatter>
            <named-formatter name="COLOR-PATTERN"/>
        </formatter>
    </console-handler>
    <periodic-rotating-file-handler name="FILE" autoflush="true">
        <formatter>
            <named-formatter name="PATTERN"/>
        </formatter>
        <file relative-to="jboss.server.log.dir" path="server.log"/>
        <suffix value=".yyyy-MM-dd"/>
        <append value="true"/>
    </periodic-rotating-file-handler>
    <logger category="com.arjuna">
        <level name="WARN"/>
    </logger>
    <logger category="org.jboss.as.config">
        <level name="DEBUG"/>
    </logger>
    <logger category="sun.rmi">
        <level name="WARN"/>
    </logger>
    <root-logger>
        <level name="INFO"/>
        <handlers>
            <handler name="CONSOLE"/>
            <handler name="FILE"/>
        </handlers>
    </root-logger>
    <formatter name="PATTERN">
        <pattern-formatter pattern="%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n"/>
    </formatter>
    <formatter name="COLOR-PATTERN">
        <pattern-formatter pattern="%K{level}%d{HH:mm:ss,SSS} %-5p [%c] (%t) %s%e%n"/>
    </formatter>
</subsystem>
Thanks.
You also need to make sure the log level on the producer side is set to INFO. In your log4j.properties file you should have something like:
log4j.rootLogger=INFO, stderr
log4j.appender.stderr=org.apache.log4j.ConsoleAppender
log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
log4j.appender.stderr.layout.ConversionPattern=[%d] %p %m (%c)%n
log4j.appender.stderr.Target=System.err
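If you want to keep the root logger at INFO but silence only the Kafka client packages, log4j 1.x also supports per-package levels. A minimal sketch you could append to the same file (the `WARN` threshold is a suggestion, not from the original answer):

```properties
# Quiet only the Kafka client loggers (WARN and above still get through)
log4j.logger.org.apache.kafka=WARN
```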
and pass this file to your Kafka producer:
-Dlog4j.configuration=file:/path/to/log4j.properties
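Since the producer here runs inside the Keycloak (WildFly) JVM rather than as a standalone process, the system property has to reach the server itself. One way to do that, assuming the default launch scripts (paths are illustrative):

```shell
# bin/standalone.conf (sourced by bin/standalone.sh on Linux)
# Append the log4j config to the server's JVM options
JAVA_OPTS="$JAVA_OPTS -Dlog4j.configuration=file:/path/to/log4j.properties"
```

On Windows the equivalent line goes in bin/standalone.conf.bat using `set "JAVA_OPTS=..."`.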