Tags: apache-spark, hive, databricks, azure-databricks, hive-metastore

How do I set up an external Hive metastore in Databricks with the configuration below?


I tried the cluster Spark configuration below to connect Databricks to an external Hive metastore hosted on Azure SQL Server, but I get an error saying the metastore client cannot be instantiated, along with a Hive version mismatch.

spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionURL .database.windows.net:1433;Initial Catalog=hivemetastore;
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <pwd>

# Skip this one if <hive-version> is 0.13.x.
spark.sql.hive.metastore.version biltin
spark.sql.hive.metastore.jars /databricks/hive_metastore_jars/*

datanucleus.fixedDatastore false
datanucleus.schema.autoCreateTables true
hive.metastore.schema.verification false
hive.metastore.schema.verification.record.version false

spark.databricks.cluster.profile singleNode
spark.databricks.delta.preview.enabled true
spark.master local[*, 4]
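One likely problem is the ConnectionURL value: the SQL Server JDBC driver expects a URL starting with the jdbc:sqlserver:// scheme followed by the fully qualified server host, and the truncated value above is missing that prefix (the server name itself is elided in the question). A minimal sketch of the expected URL shape, with a hypothetical server name as a placeholder:

```python
def sqlserver_jdbc_url(server: str, database: str, port: int = 1433) -> str:
    """Build an Azure SQL Server JDBC URL in the form that
    com.microsoft.sqlserver.jdbc.SQLServerDriver expects."""
    # `server` is the logical server name, without the domain suffix;
    # Azure appends .database.windows.net to it.
    return (f"jdbc:sqlserver://{server}.database.windows.net:{port};"
            f"database={database}")

# Example with a hypothetical server name (not the asker's actual server):
print(sqlserver_jdbc_url("myserver", "hivemetastore"))
# jdbc:sqlserver://myserver.database.windows.net:1433;database=hivemetastore
```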

Solution

  • Please find below the solution that worked for me; the configuration ran without issues on Databricks Runtime 7.3 LTS.

    (The original answer attached a screenshot of the working cluster Spark configuration; the image is not reproduced here.)
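Since the screenshot itself is unavailable, the sketch below shows what a working configuration for this setup typically looks like, following the external-metastore template from the Databricks documentation. This is an assumption, not the answerer's exact config: the server, user, and password values are placeholders, and the metastore version is set to 2.3.7, the Hive client version bundled with Runtime 7.x, so that the builtin jars can be used.

```
# Placeholders in <angle brackets> are assumptions, not values from the answer.
spark.hadoop.javax.jdo.option.ConnectionURL jdbc:sqlserver://<server>.database.windows.net:1433;database=hivemetastore
spark.hadoop.javax.jdo.option.ConnectionDriverName com.microsoft.sqlserver.jdbc.SQLServerDriver
spark.hadoop.javax.jdo.option.ConnectionUserName <user>
spark.hadoop.javax.jdo.option.ConnectionPassword <pwd>

# The metastore version must be an actual Hive version; "builtin" is only
# valid for spark.sql.hive.metastore.jars, not for the version setting.
spark.sql.hive.metastore.version 2.3.7
spark.sql.hive.metastore.jars builtin

datanucleus.fixedDatastore false
datanucleus.schema.autoCreateTables true
hive.metastore.schema.verification false
```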