Tags: hbase, apache-flink, version, flink-sql, pyflink

org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.hTable'


I am trying to connect Flink 1.14.4 with HBase 2.2.14. I added the HBase SQL connector jar flink-sql-connector-hbase-2.2-1.15.2.jar, since it is the latest available version of the connector for HBase 2.2.x.

However, I get the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o1.executeSql.
: org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.hTable'.

Table options are:

'connector'='hbase-2.2'
'table-name'='test'
'zookeeper.quorum'='127.0.0.1:2181'
        at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:184)
        at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:388)
        at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:222)
        at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:182)
        at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:182)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:182)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1665)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:752)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:872)
        at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:742)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
        at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
        at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.NoSuchMethodError: org.apache.flink.table.factories.DynamicTableFactory$Context.getPhysicalRowDataType()Lorg/apache/flink/table/types/DataType;
        at org.apache.flink.connector.hbase2.HBase2DynamicTableFactory.createDynamicTableSink(HBase2DynamicTableFactory.java:95)
        at org.apache.flink.table.factories.FactoryUtil.createTableSink(FactoryUtil.java:181)
        ... 28 more

My HBase table is defined as follows:

sink_ddl = """
        CREATE TABLE hTable (
            datemin STRING,
            family2 ROW<datemax STRING>,
            family3 ROW<channel_title STRING, channel_id STRING>,
            PRIMARY KEY (datemin) NOT ENFORCED
        ) WITH (
          'connector' = 'hbase-2.2',
          'table-name' = 'test',
          'zookeeper.quorum' = '127.0.0.1:2181'
        )
        """

I created a view to select the data and insert it into hTable:

table_env.create_temporary_view('table_api_table', table)
table_env.execute_sql("""
    INSERT INTO hTable
        SELECT
            datemin,
            ROW(datemax),
            ROW(channel_title, channel_id)
        FROM table_api_table
""").wait()

I see that Flink 1.14 doesn't support this HBase version (per the connector support matrix screenshot).

So do I have to change the HBase version?


Solution

  • Finally, it's working! I fixed this issue by doing the following:

    I edited hbase-env.sh:

    # Extra Java CLASSPATH elements.  Optional.
    export HBASE_CLASSPATH=/home/hadoop/hbase/conf
    

    I edited hbase-site.xml and added the following property:

      <property>
        <name>hbase.defaults.for.version.skip</name>
        <value>true</value>
      </property>
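    To double-check that the property actually landed in hbase-site.xml, a small sketch like this can parse the file (the helper name and the example path are hypothetical, not part of the original setup):

```python
import xml.etree.ElementTree as ET

def version_check_skipped(path: str) -> bool:
    """Return True if hbase.defaults.for.version.skip is set to true in the file."""
    root = ET.parse(path).getroot()  # the <configuration> element
    for prop in root.findall("property"):
        if prop.findtext("name") == "hbase.defaults.for.version.skip":
            return prop.findtext("value", "").strip() == "true"
    return False  # property not present at all

# Example (path is an assumption):
# version_check_skipped("/home/hadoop/hbase/conf/hbase-site.xml")
```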
    

    Then I edited the connector jar itself: I unpacked it, edited the hbase-default.xml inside, and repackaged it:

    <property>
        <name>hbase.defaults.for.version.skip</name>
        <value>true</value>
        <description>Set to true to skip the 'hbase.defaults.for.version' check.
            Setting this to true can be useful in contexts other than
            the other side of a maven generation; i.e. running in an
            IDE.  You'll want to set this boolean to true to avoid
            seeing the RuntimeException complaint: "hbase-default.xml file
            seems to be for and old version of HBase (\${hbase.version}), this
            version is X.X.X-SNAPSHOT"</description>
    </property>
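    The unpack/edit/repack step can also be done in one pass, since a jar is just a zip archive. This is a hypothetical sketch, not the exact steps from the post; note that it drops the XML prolog and comments from the rewritten file and would invalidate a signed jar:

```python
import os
import zipfile
import xml.etree.ElementTree as ET

def skip_version_check_in_jar(jar_path: str) -> None:
    """Rewrite hbase-default.xml inside the connector jar so that
    hbase.defaults.for.version.skip is set to true."""
    tmp_path = jar_path + ".tmp"
    with zipfile.ZipFile(jar_path) as src, \
         zipfile.ZipFile(tmp_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename.endswith("hbase-default.xml"):
                root = ET.fromstring(data)
                for prop in root.findall("property"):
                    if prop.findtext("name") == "hbase.defaults.for.version.skip":
                        prop.find("value").text = "true"
                data = ET.tostring(root)
            dst.writestr(item, data)  # all other entries are copied unchanged
    os.replace(tmp_path, jar_path)
```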
    

    Finally, I moved the jar into Flink's lib folder; note that the connector jar version (1.14.4) now matches the Flink version. Putting it in lib works better than loading it through:

    table_env.get_config().get_configuration().set_string(
        "pipeline.jars",
        "file:///home/hadoop/hbase/conf/flink-sql-connector-hbase-2.2_2.11-1.14.4.jar"
    )

    These articles helped me a lot:

    https://www.cnblogs.com/panfeng412/archive/2012/07/22/hbase-exception-hbase-default-xml-file-seems-to-be-for-and-old-version-of-hbase.html

    https://blog.csdn.net/bokzmm/article/details/119882885