scala, spark-streaming

How to parse JSON data in Scala?


I am new to Scala. I want to parse JSON data in Scala.

I want to loop over this data and, in each iteration, extract id, v, q, and t from values.

I am using the code below to parse the JSON:

import scala.util.parsing.json._

val data =
  """
{
  "timestamp": 1518501114949,
  "values": [
    {
      "id": "abc",
      "v": 0,
      "q": true,
      "t": 1518501114487
    },
    {
      "id": "xyz",
      "v": 15,
      "q": true,
      "t": 1518501114494
    }
  ]
}
"""

val parsed = JSON.parseFull(data)

I am getting the output below:

 Some(Map(timestamp -> 1.518501114949E12, values -> List(Map(id -> abc, v -> 0.0, q -> true, t -> 1.518501114487E12), Map(id -> xyz, v -> 15.0, q -> true, t -> 1.518501114494E12), Map(id -> klm, v -> 12.6999998, q -> true, t -> 1.518501114487E12), Map(id -> 901.Hotmelt.PSA.0759_PSAM01_Vac, v -> 1.0, q -> true, t -> 1.518501114494E12))))

but I don't know how to loop over it and fetch all the values after that,

and I don't understand why the timestamp is being converted to a value like 1.518501114949E12.


Solution

  • The problem is that parseFull returns an Option with an Any inside, so you first need to get rid of that:

    With the code below, you will keep the values:

    val listAsAny = parsed match {
      case Some(e: Map[String, Any] @unchecked) => e("values")  // the parsed JSON object; keep its "values" entry
      case None => println("Failed.")                           // parsing failed
    }
    

    But it is still typed as Any, so you can cast it as follows:

    val values = listAsAny.asInstanceOf[List[Map[String, Any]]]
    

    Now values is a List of Maps with the contents below, and you can access the values inside it as you would with a regular List:

    List(Map(id -> abc, v -> 0.0, q -> true, t -> 1.518501114487E12), Map(id -> xyz, v -> 15.0, q -> true, t -> 1.518501114494E12))
    

    For instance, to retrieve the ids you can do:

    values.map(_("id"))
    

    And the result will be:

    List(abc, xyz)
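
    Finally, to address the remaining parts of the question: parseFull represents every JSON number as a Double, which is why 1518501114949 is printed as 1.518501114949E12; calling .toLong on such a value recovers the original Long. Below is a minimal sketch (the loop body and the println are just illustrative) that builds on the values list above and extracts id, v, q, and t from every entry:

    // `values` is the List[Map[String, Any]] obtained above.
    // All numbers come back as Double, so t is converted back to Long here.
    values.foreach { entry =>
      val id = entry("id").asInstanceOf[String]
      val v  = entry("v").asInstanceOf[Double]
      val q  = entry("q").asInstanceOf[Boolean]
      val t  = entry("t").asInstanceOf[Double].toLong
      println(s"id=$id, v=$v, q=$q, t=$t")
    }

    For the sample data this should print something like id=abc, v=0.0, q=true, t=1518501114487 for the first entry.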