Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType while it is imported
I am using the Spark SQL dependency only for its data types, as if simply importing its data structures, without passing a SparkContext to the Scala main method. Why am I getting this exception?
Caused by: java.lang.NoClassDefFoundError: org/apache/spark/sql/types/StructType
Code:
package io.sekai.industry.streaming

import org.apache.spark.sql.types
import org.apache.spark.sql.types.{LongType, StringType, StructField, StructType}

class NonSparkSensorsGeneration {

  val schema = StructType(
    StructField("id", LongType, nullable = false) :: // always skip this if using the operator
      StructField("Energy Data", StringType, nullable = false,
        metadata = new types.MetadataBuilder()
          .putString("mqtt", "ESP_02/Energy Data")
          .putString("label", "EnergyData")
          .build()) ::
      Nil)

  def readFromCSVFile(shortPath: String): Unit = {
    println("Preparing to load the csv file from jar resource " + shortPath)
    val input = getClass.getResourceAsStream(shortPath)
    println("Loaded file from resource " + shortPath)
    println("Input stream from resource " + shortPath + " details: " + input.toString)
    val source = scala.io.Source.fromInputStream(input)
    val data = source.getLines.map(_.split("\t")).toArray
    source.close()
    println(data.getClass)
  }

  def main(args: Array[String]): Unit = {
    readFromCSVFile("generation")
  }
}
Then the class is extended by an object:
package io.sekai.industry.streaming
object NonSparkSensorsGenerationJob extends NonSparkSensorsGeneration
And it is run as:
"C:\Program Files\AdoptOpenJDK\jdk-12.0.1.12-hotspot\bin\java.exe" "-javaagent:D:\idea\IntelliJ IDEA Community Edition 2020.1.2\lib\idea_rt.jar=50417:D:\idea\IntelliJ IDEA Community Edition 2020.1.2\bin" -Dfile.encoding=UTF-8 @C:\Users\eljah32\AppData\Local\Temp\idea_arg_file1955225673 .industry.streaming.NonSparkSensorsGenerationJob
It runs fine when I remove the val schema variable.
You probably forgot to enable the Add dependencies with "provided" scope to classpath flag in your IDE (the flag I am referring to is from IntelliJ IDEA). Spark dependencies are typically declared with "provided" scope because a cluster supplies them at runtime, so by default they are not on the classpath when you run the main class directly from the IDE, and the first reference to a Spark class (here, initializing val schema with StructType) fails with NoClassDefFoundError.
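If the project is built with sbt, the cause is usually a dependency declaration like the following; this is a minimal sketch, and the artifact version is illustrative, not taken from the question:

```scala
// build.sbt (sketch): marking Spark as "provided" keeps it out of the
// packaged jar, assuming the runtime (a cluster, or the IDE with the
// "provided" classpath flag enabled) supplies it instead.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.2" % "provided"
```

With that scope, enabling the IntelliJ flag (or switching the scope to the default compile scope for local runs) puts spark-sql back on the run classpath, so StructType can be resolved when the class initializes.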