Overriding yarn-site.xml configuration parameters from a Spark application
When submitting a Spark application, I need to override one of the YARN configuration parameters from yarn-site.xml. Can I pass it as an extra argument to spark-submit?
The parameter I want to override is yarn.nodemanager.vmem-check-enabled.
You can pass it with --conf when submitting the job via spark-submit. Spark forwards any property prefixed with spark.hadoop. into the Hadoop Configuration it builds, so:
--conf spark.hadoop.yarn.nodemanager.vmem-check-enabled=false
Alternatively, you can set it in code, e.g. on a SparkConf before creating the session (or via SparkSession.conf.set at runtime).
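A minimal spark-submit invocation might look like the following sketch; the class name and jar are placeholders, and note that yarn.nodemanager.vmem-check-enabled is read by the NodeManager daemons, so whether an application-side override takes effect depends on your cluster setup:

```shell
# Hypothetical invocation: the spark.hadoop. prefix tells Spark to copy the
# property into the Hadoop Configuration used by the application.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.hadoop.yarn.nodemanager.vmem-check-enabled=false \
  --class com.example.MyApp \
  my-app.jar
```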
From the documentation:
Configuration for a Spark application. Used to set various Spark
parameters as key-value pairs.
Most of the time, you would create a SparkConf object with new
SparkConf(), which will load values from any spark.* Java system
properties set in your application as well. In this case, parameters
you set directly on the SparkConf object take priority over system
properties.
For unit tests, you can also call new SparkConf(false) to skip loading
external settings and get the same configuration no matter what the
system properties are.
All setter methods in this class support chaining. For example, you
can write new SparkConf().setMaster("local").setAppName("My app").
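The chaining described above can be combined with the programmatic route for this question. A minimal Scala sketch (the app name is a placeholder, and the spark.hadoop. prefix is assumed to forward the property into the Hadoop Configuration):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Setters on SparkConf chain, so the whole configuration can be built inline.
val conf = new SparkConf()
  .setAppName("My app") // placeholder application name
  .set("spark.hadoop.yarn.nodemanager.vmem-check-enabled", "false")

// Build the session from the prepared configuration.
val spark = SparkSession.builder().config(conf).getOrCreate()
```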