
Creating hive tables in S3 bucket using databricks


I want to set a global location for all the tables I create, rather than specifying it each time. For example:

CREATE TABLE table_name STORED AS PARQUET LOCATION 's3_bucket/db_name.db'

I am currently repeating this LOCATION clause for every table I create.

I am looking for a setting along the lines of:

USE DATABASE 's3_database_address'

That way I could eliminate the repetitive commands.


Solution

  • Every managed table in Hive is created under the warehouse directory by default. You don't need to specify a LOCATION unless you want tables to live somewhere other than the default warehouse location. Read this: http://www.devgrok.com/2018/12/using-s3-hive-metastore-with-emr.html?m=1
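  • If you do want everything under a specific S3 prefix, one approach is to set the location once at the database level; managed tables created in that database then default to paths under it. A minimal sketch (the bucket and object names below are placeholders):

    CREATE DATABASE db_name LOCATION 's3://my_bucket/db_name.db';
    USE db_name;
    -- No LOCATION clause needed: the managed table is created
    -- under 's3://my_bucket/db_name.db/table_name'
    CREATE TABLE table_name (id INT, name STRING) STORED AS PARQUET;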