Tags: hive, apache-flink, flink-sql

Flink SQL: creating a Hive catalog fails with "Configured default database default doesn't exist in catalog myhive."


I have followed the instructions to install the relevant libraries and Hive dependencies. However, it still fails when I create the catalog; it seems like this integration is not very stable yet. I have also tried the JDBC catalog, and it does not work either.

CREATE CATALOG myhive WITH (
'type' = 'hive',
'default-database' = 'default',
'hive-conf-dir' = '/opt/hive-conf'
);
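For context, this is what I expect to be able to run once the catalog registers, listing the same databases that the Hive CLI shows further down:

USE CATALOG myhive;
SHOW DATABASES;

Instead, CREATE CATALOG fails with the exception below.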

Caused by: org.apache.flink.table.api.ValidationException: Could not execute CREATE CATALOG: (catalogName: [myhive], properties: [{hive-conf-dir=/opt/hive-conf, default-database=default, type=hive}])
at org.apache.flink.table.operations.ddl.CreateCatalogOperation.execute(CreateCatalogOperation.java:75) ~[flink-table-api-java-uber-1.18.0.jar:1.18.0]
at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1092) ~[flink-table-api-java-uber-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.operation.OperationExecutor.callOperation(OperationExecutor.java:556) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeOperation(OperationExecutor.java:444) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeStatement(OperationExecutor.java:207) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.SqlGatewayServiceImpl.lambda$executeStatement$1(SqlGatewayServiceImpl.java:212) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.operation.OperationManager.lambda$submitOperation$1(OperationManager.java:119) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:258) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[?:?]
at java.util.concurrent.FutureTask.run(Unknown Source) ~[?:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[?:?]
at java.util.concurrent.FutureTask.run(Unknown Source) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[?:?]
... 1 more
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Configured default database default doesn't exist in catalog myhive.
    at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:309) ~[flink-connector-hive_2.12-1.18.0.jar:1.18.0]
    at org.apache.flink.table.catalog.CatalogManager.createCatalog(CatalogManager.java:309) ~[flink-table-api-java-uber-1.18.0.jar:1.18.0]
    at org.apache.flink.table.operations.ddl.CreateCatalogOperation.execute(CreateCatalogOperation.java:68) ~[flink-table-api-java-uber-1.18.0.jar:1.18.0]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1092) ~[flink-table-api-java-uber-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.operation.OperationExecutor.callOperation(OperationExecutor.java:556) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeOperation(OperationExecutor.java:444) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.operation.OperationExecutor.executeStatement(OperationExecutor.java:207) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.SqlGatewayServiceImpl.lambda$executeStatement$1(SqlGatewayServiceImpl.java:212) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.operation.OperationManager.lambda$submitOperation$1(OperationManager.java:119) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at org.apache.flink.table.gateway.service.operation.OperationManager$Operation.lambda$run$0(OperationManager.java:258) ~[flink-sql-gateway-1.18.0.jar:1.18.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[?:?]
    at java.util.concurrent.FutureTask.run(Unknown Source) ~[?:?]
    at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[?:?]
    at java.util.concurrent.FutureTask.run(Unknown Source) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[?:?]
    ... 1 more

[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.catalog.exceptions.CatalogException: Configured default database default doesn't exist in catalog myhive.

And here is my hive-site.xml:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:postgresql://.....:5432/metastore</value>
    </property>
    <property>
        <name>hive.metastore.uris</name>
        <value>thrift://.....:9083</value>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>org.postgresql.Driver</value>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
    </property>

    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
    </property>

    <property>
        <name>org.jpox.autoCreateSchema</name>
        <value>true</value>
    </property>

</configuration>

And when I access my Hive server, the database does exist:

Logging initialized using configuration in file:/opt/hive/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> show databases;
OK
abc
default
Time taken: 1.374 seconds, Fetched: 2 row(s)
hive>
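To rule out Flink talking to a metastore that simply does not contain the database, the backing schema from javax.jdo.option.ConnectionURL can also be checked directly. This is only a sketch, assuming the standard Hive metastore schema on Postgres (tables are created with quoted upper-case names):

-- run against the "metastore" Postgres database from javax.jdo.option.ConnectionURL
SELECT "NAME", "DB_LOCATION_URI" FROM "DBS";
-- both abc and default should be listed here as well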

Solution

  • Issue fixed. The Hive Metastore version has to match (or at least be compatible with) the Hive version in flink-sql-connector-hive-<HIVE_VERSION>_2.12-1.14.6.jar. This is very tedious, as the documentation does not explain it well (see the query sketch below for one way to check the metastore's schema version).
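If you are not sure which version your Hive Metastore is actually running, one way to check is to query the metastore's own VERSION table. This is only a sketch, assuming the standard Hive metastore schema on the Postgres database from javax.jdo.option.ConnectionURL; the SCHEMA_VERSION value tells you which Hive version the flink-sql-connector-hive jar has to target:

-- run against the "metastore" database from javax.jdo.option.ConnectionURL
SELECT "SCHEMA_VERSION", "VERSION_COMMENT" FROM "VERSION";
-- e.g. a SCHEMA_VERSION of 2.3.0 points at the 2.3.x Hive connector jar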