Tags: scala, apache-spark, abstract-class, spark-graphx

Scala: abstract class: Compile Error: class X needs to be abstract, since: [error] it has n unimplemented members


Hi, I'm extremely new to Scala and trying to run this simple code, but I can't get it to compile:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

import org.apache.spark._
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

class Graph[VD, ED] {
  val vertices: VertexRDD[VD]
  val edges: EdgeRDD[ED]
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)

    // Create an RDD for the vertices
    val vertices: RDD[(VertexId, (Int, Int))] =
        sc.parallelize(Array((1L, (7,-1)), (2L, (3,-1)),
                       (3L, (2,-1)), (4L, (6,-1))))

    // Create an RDD for edges
    val relationships: RDD[Edge[Boolean]] =
        sc.parallelize(Array(Edge(1L, 2L, true), Edge(1L, 4L, true),
                      Edge(2L, 4L, true), Edge(3L, 1L, true), 
                   Edge(3L, 4L, true)))

    // Create the graph
    val graph = Graph(vertices, relationships)

    // Check the graph
    graph.vertices.collect.foreach(println)

    sc.stop()
  }
}

And here is the sbt file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"

When I try to compile it I get:

$ C:\"Program Files (x86)"\sbt\bin\sbt package
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
[info] Set current project to Simple Project (in build file:/C:/spark/simple/)
[info] Compiling 1 Scala source to C:\spark\simple\target\scala-2.10\classes...
[error] C:\spark\simple\src\main\scala\SimpleApp.scala:10: class Graph needs to be abstract, since:
[error] it has 2 unimplemented members.
[error] /** As seen from class Graph, the missing signatures are as follows.
[error]  *  For convenience, these are usable as stub implementations.
[error]  */
[error]   val edges: org.apache.spark.graphx.EdgeRDD[ED] = ???
[error]   val vertices: org.apache.spark.graphx.VertexRDD[VD] = ???
[error] class Graph[VD, ED] {
[error]       ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 6 s, completed Jan 16, 2017 11:48:51 PM

I'm really new to Scala and all I need is to run some small and simple code, but I can't get it to compile. I've tried setting the vertices and edges to _, but then I got: unbound placeholder parameter for val edges.


Solution

  • This way you're defining a class with two unimplemented members: vertices and edges are declared with a type but given no value. That is why the compiler asks you to mark the class as abstract.

    Probably you want something like this:

    class Graph[VD, ED](
      val vertices: VertexRDD[VD],
      val edges: EdgeRDD[ED]) {
    }
    

    This way you're defining a class with two fields and a primary constructor that takes two parameters (vertices and edges) and assigns each one to the field of the same name.

    The val keyword in that position means those constructor parameters are exposed as if they were fields of the class.

    Also, if you don't have specific needs, handling this with a simple tuple would be handier.
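
    To illustrate both suggestions, here is a minimal sketch in plain Scala. It uses Seq in place of the Spark VertexRDD/EdgeRDD types, and a hypothetical name MyGraph so it doesn't clash with graphx's own Graph class:

    ```scala
    // Stand-in for the Spark types, just to show the pattern.
    // In the real code the fields would be VertexRDD[VD] and EdgeRDD[ED].
    class MyGraph[VD, ED](
      val vertices: Seq[VD],  // `val` turns the constructor parameter into a field
      val edges: Seq[ED])

    val g = new MyGraph(Seq(1L, 2L, 3L), Seq("1->2", "2->3"))
    println(g.vertices)  // fields are readable from outside the class

    // The tuple alternative: no class definition needed at all
    val graphAsTuple = (Seq(1L, 2L, 3L), Seq("1->2", "2->3"))
    println(graphAsTuple._1)  // vertices
    println(graphAsTuple._2)  // edges
    ```

    Note that in the original program you would not define any Graph class at all: org.apache.spark.graphx already provides Graph with an apply method, so Graph(vertices, relationships) resolves to the library class once your conflicting definition is removed.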