I am making a JDBC connection to a Denodo database using PySpark. The table I am connecting to contains two columns of type "TIMESTAMP_WITH_TIMEZONE". Since Spark provides built-in JDBC support for only a handful of databases, of which Denodo is not one, it does not recognize the "TIMESTAMP_WITH_TIMEZONE" datatype and therefore cannot map it to any Spark SQL datatype. To work around this I am supplying a custom schema (c_schema here), but that is not working either and I get the same error. Below is the code snippet.
c_schema = "`game start date` TIMESTAMP, `game end date` TIMESTAMP"
df = spark.read.jdbc(
    "jdbc_url",
    "schema.table_name",
    properties={
        "user": "user_name",
        "password": "password",
        "customSchema": c_schema,
        "driver": "com.denodo.vdp.jdbc.Driver",
    },
)
Please let me know how I can fix this.
For anyone else facing this issue while connecting to Denodo using Spark: use the CAST function to convert the "TIMESTAMP_WITH_TIMEZONE" datatype into another datatype such as String, Date, or Timestamp. I had also posted this question on the Denodo community page and have attached its official response.
CAST("PLANNED START DATE" AS DATE) AS "PLANNED_START_DATE"
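As a minimal sketch of how that CAST can be wired into PySpark, the snippet below pushes it down to Denodo by passing a subquery as the table argument, so the conversion happens on the Denodo side and the unsupported "TIMESTAMP_WITH_TIMEZONE" type never reaches Spark. The URL, credentials, and column names are placeholders based on the question, not a tested configuration.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("denodo-read").getOrCreate()

# Placeholder URL: Denodo's JDBC URLs follow jdbc:vdb://host:port/database.
jdbc_url = "jdbc:vdb://denodo_host:9999/database_name"

# Wrap the CASTs in a subquery and alias it; Spark treats the whole
# expression as the source table, so Denodo returns plain DATE columns.
query = """(SELECT CAST("game start date" AS DATE) AS game_start_date,
                   CAST("game end date" AS DATE) AS game_end_date
            FROM schema.table_name) AS t"""

df = spark.read.jdbc(
    jdbc_url,
    query,
    properties={
        "user": "user_name",
        "password": "password",
        "driver": "com.denodo.vdp.jdbc.Driver",
    },
)
df.printSchema()

With this approach the customSchema option is no longer needed, since the schema Spark infers already contains only types it supports.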