How to change Spark's default log4j configuration file
I am running PySpark in the Spyder IDE, and this warning shows up every time:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/02/15 17:05:12 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/02/15 17:05:29 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
I tried editing the file C:\spark\spark-3.2.1-bin-hadoop2.7\conf\log4j.properties.template to change the warning level to 'ERROR', but it didn't do anything.
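(For reference, the log output itself suggests calling sc.setLogLevel(newLevel) at runtime; in PySpark that would look roughly like the sketch below, with a placeholder app name, but I am after a change to the default configuration file rather than a per-session call.)

    from pyspark.sql import SparkSession

    # Build (or reuse) a session; "demo" is just a placeholder app name.
    spark = SparkSession.builder.appName("demo").getOrCreate()

    # Per-session workaround: raise the driver log level to ERROR at runtime.
    spark.sparkContext.setLogLevel("ERROR")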
- Rename log4j.properties.template to log4j.properties (the key line to edit is shown in the sketch after this list)
- Make sure log4j.properties is on the classpath or under $SPARK_HOME/conf/
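As a minimal sketch of what the renamed file might contain, assuming the stock Spark 3.2.x template (which uses log4j 1.x): the root logger level is the line to change from INFO to ERROR, and the NativeCodeLoader override is an optional, assumed extra to silence that specific warning.

    # Root logger: changed from the template's default INFO to ERROR
    log4j.rootCategory=ERROR, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Optional (assumed): silence the native-hadoop warning specifically
    log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

Restart the PySpark session (or the Spyder kernel) afterwards; the file is read when the driver JVM starts, so an already-running session will not pick up the change.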