Scala: abstract class: Compile Error: class X needs to be abstract, since: [error] it has n unimplemented members

Hi, I'm new to Scala and I'm trying to run this simple piece of code, but I can't get it to compile:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

import org.apache.spark._
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

class Graph[VD, ED] {
  val vertices: VertexRDD[VD]
  val edges: EdgeRDD[ED]
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Create an RDD for the vertices
    val vertices: RDD[(VertexId, (Int, Int))] =
        sc.parallelize(Array((1L, (7,-1)), (2L, (3,-1)),
                       (3L, (2,-1)), (4L, (6,-1))))

    // Create an RDD for edges
    val relationships: RDD[Edge[Boolean]] =
        sc.parallelize(Array(Edge(1L, 2L, true), Edge(1L, 4L, true),
                      Edge(2L, 4L, true), Edge(3L, 1L, true), 
                   Edge(3L, 4L, true)))

    // Create the graph
    val graph = Graph(vertices, relationships)

    // Check the graph
    graph.vertices.collect.foreach(println)

    sc.stop()
  }
}

Here is the sbt build file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"

When I try to compile it, I get:

$ C:\"Program Files (x86)"\sbt\bin\sbt package
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to Simple Project (in build file:/C:/spark/simple/)
[info] Compiling 1 Scala source to C:\spark\simple\target\scala-2.10\classes...
[error] C:\spark\simple\src\main\scala\SimpleApp.scala:10: class Graph needs to be abstract, since:
[error] it has 2 unimplemented members.
[error] /** As seen from class Graph, the missing signatures are as follows.
[error]  *  For convenience, these are usable as stub implementations.
[error]  */
[error]   val edges: org.apache.spark.graphx.EdgeRDD[ED] = ???
[error]   val vertices: org.apache.spark.graphx.VertexRDD[VD] = ???
[error] class Graph[VD, ED] {
[error]       ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 6 s, completed Jan 16, 2017 11:48:51 PM

I'm new to Scala and I only need to run some small, simple code, but I can't compile it. I tried setting vertices and edges to _, but then I got: unbound placeholder parameter for val edges.

[error] C:\spark\simple\src\main\scala\SimpleApp.scala:10: class Graph needs to be abstract, since:

[error] it has 2 unimplemented members.

As the error message says, you need to provide values for the immutable fields vertices and edges in your class definition. You can initialize them in the constructor body with whatever values you like, for example:

class Graph[VD, ED] {
  val vertices: VertexRDD[VD] = /* calculation here */
  val edges: EdgeRDD[ED] = /* calculation here */
}

Or you can list them as constructor parameters, so that users of the class supply the values when instantiating it:

class Graph[VD, ED] (val vertices: VertexRDD[VD], val edges: EdgeRDD[ED])

which is effectively equivalent to:

class Graph[VD, ED] (val theVertices: VertexRDD[VD], val theEdges: EdgeRDD[ED])
{
    val vertices = theVertices
    val edges = theEdges
}

As written, you have defined a class with two unimplemented members, which is why the compiler requires the class to be declared abstract.
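To see the rule in isolation, here is a minimal sketch with no Spark dependencies (Container and IntContainer are made-up names for illustration): a class with an unimplemented val only compiles if it is declared abstract, and a concrete subclass must then supply the missing member.

```scala
// An abstract class may leave members unimplemented.
abstract class Container[A] {
  val items: List[A] // abstract member: no value given here
}

// A concrete subclass must provide the missing member,
// otherwise it too would need to be declared abstract.
class IntContainer extends Container[Int] {
  val items: List[Int] = List(1, 2, 3)
}

object Demo extends App {
  val c = new IntContainer
  println(c.items.sum) // prints 6
}
```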

What you probably want is something like this:

class Graph[VD, ED](
  val vertices: VertexRDD[VD],
  val edges: EdgeRDD[ED]) {
}

This defines a class with two fields and a default constructor that takes two parameters (vertices and edges) and assigns the corresponding values to the fields of the same names.

The keyword val in that position means you want those constructor parameters to be accessible as if they were fields of the class.
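A tiny self-contained example of that behavior (Pair is a made-up name, not part of Spark):

```scala
// `val` in the parameter list turns each constructor argument
// into a public immutable field of the same name.
class Pair[A, B](val first: A, val second: B)

object PairDemo extends App {
  val p = new Pair(1, "one")
  println(p.first)  // prints 1
  println(p.second) // prints one
}
```

Without val, the parameters would only be visible inside the constructor body and could not be read from outside the class.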

Also, if you have no particular requirements, it may be more convenient to just work with a plain tuple.
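For instance, a sketch of the tuple approach, with plain Scala collections standing in for the Spark RDDs from the question:

```scala
object TupleGraph extends App {
  // Vertices and edges as plain collections; in the real program
  // these would be the RDDs built with sc.parallelize.
  val vertices = List((1L, (7, -1)), (2L, (3, -1)), (3L, (2, -1)), (4L, (6, -1)))
  val edges    = List((1L, 2L), (1L, 4L), (2L, 4L), (3L, 1L), (3L, 4L))

  // No class definition needed: the "graph" is just a (vertices, edges) pair.
  val graph = (vertices, edges)

  graph._1.foreach(println) // print the vertices
}
```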