ERROR running spark-shell and PySpark on my Windows system

Can anyone help me with this issue? I have been trying to install and run Spark on my machine so I can do some work in Scala and PySpark, but I keep hitting the error below whenever I run spark-shell. I have tried installing both Java 8 and Java 11 and I'm not sure why the error keeps appearing; I'm also facing some build errors with Maven and sbt. I followed a few installation guides on YouTube, but most of them didn't explain the errors that came up.

C:\Spark\spark-3.2.1-bin-hadoop3.2\bin>spark-shell
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/C:/Spark/spark-3.2.1-bin-hadoop3.2/jars/spark-unsafe_2.12-3.2.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
    WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    22/02/24 16:42:59 ERROR SparkContext: Error initializing SparkContext.
    java.lang.reflect.InvocationTargetException
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
            at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
            at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
            at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
            at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
            at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
            at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
            at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate(SparkSession.scala:949)
            at scala.Option.getOrElse(Option.scala:189)
            at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
            at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
            at $line3.$read$$iw$$iw.<init>(<console>:15)
            at $line3.$read$$iw.<init>(<console>:42)
            at $line3.$read.<init>(<console>:44)
            at $line3.$read$.<init>(<console>:48)
            at $line3.$read$.<clinit>(<console>)
            at $line3.$eval$.$print$lzycompute(<console>:7)
            at $line3.$eval$.$print(<console>:6)
            at $line3.$eval.$print(<console>)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
            at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
            at scala.tools.nsc.interpreter.IMain.$anonfun$interpret(IMain.scala:568)
            at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
            at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
            at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
            at scala.tools.nsc.interpreter.IMain.loadAndRunReq(IMain.scala:567)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
            at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun(IMain.scala:216)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.collection.immutable.List.foreach(List.scala:431)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:165)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly(ILoop.scala:166)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
            at org.apache.spark.repl.SparkILoop.loopPostInit(SparkILoop.scala:153)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:221)
            at org.apache.spark.repl.SparkILoop.withSuppressedSettings(SparkILoop.scala:189)
            at org.apache.spark.repl.SparkILoop.startup(SparkILoop.scala:201)
            at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
            at org.apache.spark.repl.Main$.doMain(Main.scala:78)
            at org.apache.spark.repl.Main$.main(Main.scala:58)
            at org.apache.spark.repl.Main.main(Main.scala)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
            at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
            at org.apache.spark.deploy.SparkSubmit.doRunMain(SparkSubmit.scala:180)
            at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
            at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
            at org.apache.spark.deploy.SparkSubmit$$anon.doSubmit(SparkSubmit.scala:1043)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.net.URISyntaxException: Illegal character in path at index 44: spark://ITEM-S92418.emea.msad.sopra:51852/C:\classes
            at java.base/java.net.URI$Parser.fail(URI.java:2913)
            at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
            at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
            at java.base/java.net.URI$Parser.parse(URI.java:3114)
            at java.base/java.net.URI.<init>(URI.java:600)
            at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
            ... 70 more
    22/02/24 16:42:59 ERROR Utils: Uncaught exception in thread main
    java.lang.NullPointerException
            at org.apache.spark.scheduler.local.LocalSchedulerBackend.org$apache$spark$scheduler$local$LocalSchedulerBackend$$stop(LocalSchedulerBackend.scala:173)
            at org.apache.spark.scheduler.local.LocalSchedulerBackend.stop(LocalSchedulerBackend.scala:144)
            at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:927)
            at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2567)
            at org.apache.spark.SparkContext.$anonfun$stop(SparkContext.scala:2086)
            at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442)
            at org.apache.spark.SparkContext.stop(SparkContext.scala:2086)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:677)
            at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
            at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate(SparkSession.scala:949)
            at scala.Option.getOrElse(Option.scala:189)
            at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
            at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
            at $line3.$read$$iw$$iw.<init>(<console>:15)
            at $line3.$read$$iw.<init>(<console>:42)
            at $line3.$read.<init>(<console>:44)
            at $line3.$read$.<init>(<console>:48)
            at $line3.$read$.<clinit>(<console>)
            at $line3.$eval$.$print$lzycompute(<console>:7)
            at $line3.$eval$.$print(<console>:6)
            at $line3.$eval.$print(<console>)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
            at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
            at scala.tools.nsc.interpreter.IMain.$anonfun$interpret(IMain.scala:568)
            at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
            at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
            at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
            at scala.tools.nsc.interpreter.IMain.loadAndRunReq(IMain.scala:567)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
            at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun(IMain.scala:216)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.collection.immutable.List.foreach(List.scala:431)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:165)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly(ILoop.scala:166)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
            at org.apache.spark.repl.SparkILoop.loopPostInit(SparkILoop.scala:153)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:221)
            at org.apache.spark.repl.SparkILoop.withSuppressedSettings(SparkILoop.scala:189)
            at org.apache.spark.repl.SparkILoop.startup(SparkILoop.scala:201)
            at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
            at org.apache.spark.repl.Main$.doMain(Main.scala:78)
            at org.apache.spark.repl.Main$.main(Main.scala:58)
            at org.apache.spark.repl.Main.main(Main.scala)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
            at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
            at org.apache.spark.deploy.SparkSubmit.doRunMain(SparkSubmit.scala:180)
            at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
            at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
            at org.apache.spark.deploy.SparkSubmit$$anon.doSubmit(SparkSubmit.scala:1043)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    22/02/24 16:42:59 WARN MetricsSystem: Stopping a MetricsSystem that is not running
    22/02/24 16:42:59 ERROR Main: Failed to initialize Spark session.
    java.lang.reflect.InvocationTargetException
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
            at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
            at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
            at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
            at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
            at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
            at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
            at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220)
            at org.apache.spark.SparkContext.<init>(SparkContext.scala:581)
            at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
            at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate(SparkSession.scala:949)
            at scala.Option.getOrElse(Option.scala:189)
            at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
            at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
            at $line3.$read$$iw$$iw.<init>(<console>:15)
            at $line3.$read$$iw.<init>(<console>:42)
            at $line3.$read.<init>(<console>:44)
            at $line3.$read$.<init>(<console>:48)
            at $line3.$read$.<clinit>(<console>)
            at $line3.$eval$.$print$lzycompute(<console>:7)
            at $line3.$eval$.$print(<console>:6)
            at $line3.$eval.$print(<console>)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
            at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
            at scala.tools.nsc.interpreter.IMain.$anonfun$interpret(IMain.scala:568)
            at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
            at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
            at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
            at scala.tools.nsc.interpreter.IMain.loadAndRunReq(IMain.scala:567)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
            at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
            at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun(IMain.scala:216)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:216)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.collection.immutable.List.foreach(List.scala:431)
            at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark(SparkILoop.scala:83)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:97)
            at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:165)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly(ILoop.scala:166)
            at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:206)
            at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:163)
            at org.apache.spark.repl.SparkILoop.loopPostInit(SparkILoop.scala:153)
            at org.apache.spark.repl.SparkILoop.$anonfun$process(SparkILoop.scala:221)
            at org.apache.spark.repl.SparkILoop.withSuppressedSettings(SparkILoop.scala:189)
            at org.apache.spark.repl.SparkILoop.startup(SparkILoop.scala:201)
            at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
            at org.apache.spark.repl.Main$.doMain(Main.scala:78)
            at org.apache.spark.repl.Main$.main(Main.scala:58)
            at org.apache.spark.repl.Main.main(Main.scala)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.base/java.lang.reflect.Method.invoke(Method.java:566)
            at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
            at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
            at org.apache.spark.deploy.SparkSubmit.doRunMain(SparkSubmit.scala:180)
            at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
            at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
            at org.apache.spark.deploy.SparkSubmit$$anon.doSubmit(SparkSubmit.scala:1043)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.net.URISyntaxException: Illegal character in path at index 44: spark://ITEM-S92418.emea.msad.sopra:51852/C:\classes
            at java.base/java.net.URI$Parser.fail(URI.java:2913)
            at java.base/java.net.URI$Parser.checkChars(URI.java:3084)
            at java.base/java.net.URI$Parser.parseHierarchical(URI.java:3166)
            at java.base/java.net.URI$Parser.parse(URI.java:3114)
            at java.base/java.net.URI.<init>(URI.java:600)
            at org.apache.spark.repl.ExecutorClassLoader.<init>(ExecutorClassLoader.scala:57)
            ... 70 more
    22/02/24 16:42:59 ERROR Utils: Uncaught exception in thread shutdown-hook-0
    java.lang.ExceptionInInitializerError
            at org.apache.spark.executor.Executor.stop(Executor.scala:333)
            at org.apache.spark.executor.Executor.$anonfun$stopHookReference(Executor.scala:76)
            at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
            at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll(ShutdownHookManager.scala:188)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
            at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll(ShutdownHookManager.scala:188)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.util.Try$.apply(Try.scala:213)
            at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
            at org.apache.spark.util.SparkShutdownHookManager$$anon.run(ShutdownHookManager.scala:178)
            at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
            at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:834)
    Caused by: java.lang.NullPointerException
            at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:465)
            at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
            ... 16 more
    22/02/24 16:42:59 WARN ShutdownHookManager: ShutdownHook '' failed, java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
    java.util.concurrent.ExecutionException: java.lang.ExceptionInInitializerError
            at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
            at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:205)
            at org.apache.hadoop.util.ShutdownHookManager.executeShutdown(ShutdownHookManager.java:124)
            at org.apache.hadoop.util.ShutdownHookManager.run(ShutdownHookManager.java:95)
    Caused by: java.lang.ExceptionInInitializerError
            at org.apache.spark.executor.Executor.stop(Executor.scala:333)
            at org.apache.spark.executor.Executor.$anonfun$stopHookReference(Executor.scala:76)
            at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
            at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll(ShutdownHookManager.scala:188)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2019)
            at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll(ShutdownHookManager.scala:188)
            at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
            at scala.util.Try$.apply(Try.scala:213)
            at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
            at org.apache.spark.util.SparkShutdownHookManager$$anon.run(ShutdownHookManager.scala:178)
            at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
            at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
            at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
            at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
            at java.base/java.lang.Thread.run(Thread.java:834)
    Caused by: java.lang.NullPointerException
            at org.apache.spark.shuffle.ShuffleBlockPusher$.<init>(ShuffleBlockPusher.scala:465)
            at org.apache.spark.shuffle.ShuffleBlockPusher$.<clinit>(ShuffleBlockPusher.scala)
            ... 16 more

"Install Java 8 instead of Java 11; the latter produces these kinds of warnings in Spark."
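
If you go that route, make sure spark-shell actually picks up the Java 8 runtime by pointing JAVA_HOME at it before launching. A minimal sketch for the Windows command prompt, assuming a hypothetical Java 8 install at C:\Program Files\Java\jdk1.8.0_321 (substitute the path of your own installation):

    :: Point JAVA_HOME at the Java 8 JDK (hypothetical path - adjust to yours)
    set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_321"

    :: Put its bin directory first so this java wins over any Java 11 on PATH
    set "PATH=%JAVA_HOME%\bin;%PATH%"

    :: Confirm which JVM Spark will see, then launch the shell
    java -version
    C:\Spark\spark-3.2.1-bin-hadoop3.2\bin\spark-shell

Note that this only changes the environment of the current console window; to make it permanent, set JAVA_HOME in the system environment variables instead.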
