Tags: apache-spark, hbase, apache-spark-sql, spark-streaming, apache-phoenix

I want to collect a DataFrame column's values into an array so I can run some computations on them. Is that possible?


I am loading the data from Phoenix like this:

val tableDF = sqlContext.phoenixTableAsDataFrame("Hbtable", Array("ID", "distance"), conf = configuration)

and I want to carry out the following computation on the values of the distance column:

val list = Array(10, 20, 30, 40, 10, 20, 0, 10, 20, 30, 40, 50, 60) // values from the distance column
val first = list(0)
val last = list(list.length - 1)
var m = 0
for (a <- 0 to list.length - 2) {
  // accumulate each local peak: a value followed by a drop to a non-negative value
  if (list(a + 1) < list(a) && list(a + 1) >= 0) {
    m = m + list(a)
  }
}
val totalDist = m + last - first
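The computation above can be wrapped in a plain Scala function so it can be applied to whatever array you collect from the DataFrame. This is a sketch with no Spark dependency; the name `totalDistance` is illustrative, not from the original post.

```scala
// Sketch: the poster's computation as a reusable function (name is hypothetical).
// Sums each "local peak" (a value followed by a drop to a non-negative value),
// then adjusts by the last and first elements.
def totalDistance(list: Array[Int]): Int = {
  val first = list(0)
  val last  = list(list.length - 1)
  var m = 0
  for (a <- 0 to list.length - 2) {
    if (list(a + 1) < list(a) && list(a + 1) >= 0) {
      m = m + list(a)
    }
  }
  m + last - first
}
```

Once the column is collected (see the solution below), the function can be called as `totalDistance(array)`.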

Solution

  • You can do something like this; it returns an Array[Any]:

    val array = df.select("distance").rdd.map(r => r(0)).collect()

    If you want the values with their proper element type, cast each one; this returns an Array[Int]:

    val array = df.select("distance").rdd.map(r => r(0).asInstanceOf[Int]).collect()
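Note that `asInstanceOf[Int]` throws if a row holds a null or a non-integer value. If the collected data arrives as `Array[Any]`, a pattern match via the standard collection method `collect` converts it more defensively, silently dropping anything that is not an `Int` (a sketch; the sample values stand in for the real `df.select(...).collect()` result):

```scala
// Stand-in for the Array[Any] produced by df.select("distance").rdd.map(r => r(0)).collect()
val raw: Array[Any] = Array(10, 20, null, 30)

// Keep only the Int values; nulls and other types are filtered out
// instead of throwing a ClassCastException.
val distances: Array[Int] = raw.collect { case i: Int => i }
```

This trades silent filtering for safety, so it fits when null distances should simply be ignored.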