Tags: apache-spark, task, broadcast, executor, spark-core

How to get a Spark broadcast variable in the executor?


Here I have one app that runs two jobs. In the first job I want to set the broadcast variable to true and access it in the executor, and in the second job I want to set the broadcast variable to false. How can I achieve this? My code is:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
val sc = new SparkContext(conf)
var setCapture = true
sc.broadcast(setCapture)
val file = sc.textFile("file", 2)
val counts = file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
val report = counts.collect()
setCapture = false
sc.broadcast(setCapture)
val packageResult = sc.parallelize(report).filter(_._1 == "package")
packageResult.collect.foreach(println)

I want to access the broadcast variable setCapture in:

org.apache.spark.scheduler.ResultTask,

org.apache.spark.rdd.HadoopRDD,

org.apache.spark.util.collection.ExternalAppendOnlyMap,

org.apache.spark.shuffle.hash.HashShuffleWriter.

What should I do?


Solution

  • From Spark documentation

    Broadcast variables are created from a variable v by calling SparkContext.broadcast(v). The broadcast variable is a wrapper around v, and its value can be accessed by calling the value method. The code below shows this:

    scala> val broadcastVar = sc.broadcast(Array(1, 2, 3)) 
    broadcastVar: org.apache.spark.broadcast.Broadcast[Array[Int]] = Broadcast(0)
    
    scala> broadcastVar.value
    res0: Array[Int] = Array(1, 2, 3)
    

    After the broadcast variable is created, it should be used instead of the value v in any functions run on the cluster so that v is not shipped to the nodes more than once. In addition, the object v should not be modified after it is broadcast in order to ensure that all nodes get the same value of the broadcast variable (e.g. if the variable is shipped to a new node later).
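
  • Applied to the code in the question, this means each job should create its own Broadcast handle and read it through the value method inside the closures passed to the RDD operations, since those closures run on the executors; the payload of a Broadcast cannot be changed after it is created. A minimal sketch of that pattern (the names captureOn and captureOff are illustrative, not from the original code):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
    val sc = new SparkContext(conf)

    // First job: broadcast true; the closures passed to the RDD
    // operations run on the executors and read the payload via .value.
    val captureOn = sc.broadcast(true)
    val counts = sc.textFile("file", 2)
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .filter(_ => captureOn.value)          // executor-side read: true
    val report = counts.collect()

    // Second job: a Broadcast is read-only after creation, so release
    // the old copy and create a fresh Broadcast for the new value.
    captureOn.unpersist()
    val captureOff = sc.broadcast(false)
    val packageResult = sc.parallelize(report)
      .filter(_._1 == "package")
      .map(pair => (pair, captureOff.value)) // executor-side read: false
      .collect()
    packageResult.foreach(println)

    Note that this makes the flag visible to the functions you pass to the RDD operations; it does not make it visible inside Spark internals such as ResultTask or HashShuffleWriter, which never have your Broadcast handle in scope, so any flag-dependent behaviour has to live in your own closures.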