Tags: apache-spark, cassandra, apache-spark-sql, spark-cassandra-connector

How to use java.time.LocalDate in Cassandra query from Spark?


We have a table in Cassandra with a column start_time of type date.

When we execute the following code:

val resultRDD = inputRDD.joinWithCassandraTable(KEY_SPACE,TABLE)
   .where("start_time = ?", java.time.LocalDate.now)

We get the following error:

com.datastax.spark.connector.types.TypeConversionException: Cannot convert object 2016-10-13 of type class java.time.LocalDate to com.datastax.driver.core.LocalDate.
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:45)
at com.datastax.spark.connector.types.TypeConverter$$anonfun$convert$1.apply(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$$anonfun$convertPF$14.applyOrElse(TypeConverter.scala:449)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:439)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
at com.datastax.spark.connector.types.TypeConverter$LocalDateConverter$.convert(TypeConverter.scala:439)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter$$anonfun$convertPF$29.applyOrElse(TypeConverter.scala:788)
at com.datastax.spark.connector.types.TypeConverter$class.convert(TypeConverter.scala:43)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.com$datastax$spark$connector$types$NullableTypeConverter$$super$convert(TypeConverter.scala:771)
at com.datastax.spark.connector.types.NullableTypeConverter$class.convert(TypeConverter.scala:56)
at com.datastax.spark.connector.types.TypeConverter$OptionToNullConverter.convert(TypeConverter.scala:771)
at com.datastax.spark.connector.writer.BoundStatementBuilder$$anonfun$8.apply(BoundStatementBuilder.scala:93)

I've tried to register custom converters according to the documentation:

import scala.reflect.runtime.universe.typeTag
import com.datastax.spark.connector.types.TypeConverter

// Converts the JDK 8 java.time.LocalDate to the Java driver's LocalDate.
object JavaLocalDateToCassandraLocalDateConverter extends TypeConverter[com.datastax.driver.core.LocalDate] {
  def targetTypeTag = typeTag[com.datastax.driver.core.LocalDate]
  def convertPF = {
    case ld: java.time.LocalDate => com.datastax.driver.core.LocalDate.fromYearMonthDay(ld.getYear, ld.getMonthValue, ld.getDayOfMonth)
    case _ => com.datastax.driver.core.LocalDate.fromYearMonthDay(1971, 1, 1) // catch-all fallback
  }
}

// Converts the Java driver's LocalDate back to java.time.LocalDate.
object CassandraLocalDateToJavaLocalDateConverter extends TypeConverter[java.time.LocalDate] {
  def targetTypeTag = typeTag[java.time.LocalDate]
  def convertPF = {
    case ld: com.datastax.driver.core.LocalDate => java.time.LocalDate.of(ld.getYear, ld.getMonth, ld.getDay)
    case _ => java.time.LocalDate.now // catch-all fallback
  }
}

TypeConverter.registerConverter(JavaLocalDateToCassandraLocalDateConverter)
TypeConverter.registerConverter(CassandraLocalDateToJavaLocalDateConverter)

But it didn't help.

How can I use JDK 8 date/time classes in Cassandra queries executed from Spark?


Solution

  • I think the simplest thing to do in a where clause like this is to just call

    sc
     .cassandraTable("test","test")
     .where("start_time = ?", java.time.LocalDate.now.toString)
     .collect
    

    And just pass in the string, since that will be a well-defined conversion.
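
    If you would rather not round-trip through a string, you can also build the driver's LocalDate yourself with the same fromYearMonthDay factory your converter uses. A sketch against the same hypothetical "test" table:

    val today = java.time.LocalDate.now
    sc
     .cassandraTable("test","test")
     .where("start_time = ?",
       com.datastax.driver.core.LocalDate.fromYearMonthDay(
         today.getYear, today.getMonthValue, today.getDayOfMonth))
     .collect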

    There seems to be an issue in the TypeConverters where your converter is not taking precedence over the built-in converter. I'll take a quick look.

    --Edit--

    It seems like the registered converters are not being properly transferred to the executors. In local mode the code works as expected, which makes me think this is a serialization issue. I would open a ticket on the Spark Cassandra Connector for this issue.
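
    Until that is fixed, one possible workaround (an untested sketch) is to force the registration to run on each executor JVM before the join, for example via a dummy action:

    // Untested: registers the converters once per executor JVM, so the
    // executor-side registry matches the driver's.
    inputRDD.foreachPartition { _ =>
      TypeConverter.registerConverter(JavaLocalDateToCassandraLocalDateConverter)
      TypeConverter.registerConverter(CassandraLocalDateToJavaLocalDateConverter)
    }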