I am trying to write a custom codec to convert Cassandra columns of type timestamp to org.joda.time.DateTime.
I am building my project with sbt version 0.13.13.
I wrote a test that serializes and deserializes a DateTime object. When I run the test from the command line with sbt "test:testOnly *DateTimeCodecTest", the project builds and the test passes.
However, if I try to build the project inside IntelliJ, I receive the following error:
Error:(17, 22) overloaded method constructor TypeCodec with alternatives:
(x$1: com.datastax.driver.core.DataType,x$2: shade.com.datastax.spark.connector.google.common.reflect.TypeToken[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime] <and>
(x$1: com.datastax.driver.core.DataType,x$2: Class[org.joda.time.DateTime])com.datastax.driver.core.TypeCodec[org.joda.time.DateTime]
cannot be applied to (com.datastax.driver.core.DataType, com.google.common.reflect.TypeToken[org.joda.time.DateTime])
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {
Here is the codec:
import java.nio.ByteBuffer

import com.datastax.driver.core.exceptions.InvalidTypeException
import com.datastax.driver.core.{ DataType, ProtocolVersion, TypeCodec }
import com.google.common.reflect.TypeToken
import org.joda.time.{ DateTime, DateTimeZone }

/**
 * Provides serialization between Cassandra types and org.joda.time.DateTime
 *
 * Reference for writing custom codecs in Scala:
 * https://www.datastax.com/dev/blog/writing-scala-codecs-for-the-java-driver
 */
object DateTimeCodec extends TypeCodec[DateTime](DataType.timestamp(), TypeToken.of(classOf[DateTime]).wrap()) {

  override def serialize(value: DateTime, protocolVersion: ProtocolVersion): ByteBuffer = {
    if (value == null) return null
    val millis: Long = value.getMillis
    TypeCodec.bigint().serializeNoBoxing(millis, protocolVersion)
  }

  override def deserialize(bytes: ByteBuffer, protocolVersion: ProtocolVersion): DateTime = {
    val millis: Long = TypeCodec.bigint().deserializeNoBoxing(bytes, protocolVersion)
    new DateTime(millis).withZone(DateTimeZone.UTC)
  }

  // Do we need a formatter?
  override def format(value: DateTime): String = value.getMillis.toString

  // Do we need a formatter?
  override def parse(value: String): DateTime = {
    try {
      if (value == null ||
          value.isEmpty ||
          value.equalsIgnoreCase("NULL")) throw new Exception("Cannot produce a DateTime object from empty value")
      // Do we need a formatter?
      else new DateTime(value)
    } catch {
      // TODO: Determine the more specific exception that would be thrown in this case
      case e: Exception =>
        throw new InvalidTypeException(s"""Cannot parse DateTime from "$value"""", e)
    }
  }
}
and here is the test:
import com.datastax.driver.core.ProtocolVersion
import org.joda.time.{ DateTime, DateTimeZone }
import org.scalatest.FunSpec

class DateTimeCodecTest extends FunSpec {

  describe("Serialization") {

    it("should serialize between Cassandra types and org.joda.time.DateTime") {
      val now = new DateTime().withZone(DateTimeZone.UTC)
      val result = DateTimeCodec.deserialize(
        // TODO: Verify correct ProtocolVersion for DSE 5.0
        DateTimeCodec.serialize(now, ProtocolVersion.V4), ProtocolVersion.V4
      )
      assertResult(now)(result)
    }
  }
}
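For completeness, the codec only takes effect at runtime once it is registered with the driver. A minimal registration sketch (the object name and contact point below are placeholders, not my actual setup):

import com.datastax.driver.core.Cluster

// Sketch only: register the codec at application startup.
// "127.0.0.1" and the object name CodecSetup are placeholders.
object CodecSetup {
  def main(args: Array[String]): Unit = {
    val cluster = Cluster.builder()
      .addContactPoint("127.0.0.1")
      .build()
    // Tell the driver to use DateTimeCodec for timestamp <-> DateTime conversions.
    cluster.getConfiguration.getCodecRegistry.register(DateTimeCodec)
  }
}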
I make extensive use of the debugger within IntelliJ, as well as the ability to quickly run a single test with a hotkey. Losing the ability to compile within the IDE is almost as bad as losing the ability to compile at all. Any help would be appreciated, and I am more than happy to provide additional information about my project or environment if anyone needs it.
Edit, update:
The project compiles within IntelliJ if I provide an instance of com.google.common.reflect.TypeToken as opposed to shade.com.datastax.spark.connector.google.common.reflect.TypeToken. However, this breaks the build within sbt.
I resolved the issue. It stemmed from conflicting versions of spark-cassandra-connector: both the shaded and unshaded flavors of the dependency were on the classpath, and removing the shaded one fixed the build.
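For anyone who hits the same error: the root cause is visible in the compiler message above. The shaded connector bundles its own copy of Guava, so the TypeCodec constructor it exposes expects a shade.com.datastax.spark.connector.google.common.reflect.TypeToken, while the codec imports plain Guava's com.google.common.reflect.TypeToken; with both jars present, sbt and IntelliJ could end up resolving the constructor against different copies. After the fix, the dependency list looks roughly like this (a sketch; artifact versions are illustrative placeholders, not my exact build):

// build.sbt (sketch): a single, unshaded spark-cassandra-connector entry;
// the shaded variant that previously sat alongside it has been removed.
// Versions below are placeholders.
libraryDependencies ++= Seq(
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.1",
  "com.datastax.cassandra" % "cassandra-driver-core" % "3.1.4",
  "joda-time" % "joda-time" % "2.9.9"
)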