hadoop | hive | hue | cloudera-manager

Hive queries not working when passing .hql file using -f hive option


I have a weird problem and have searched everywhere but can't find an answer. I am running Cloudera 4.6 on a single node, using a local MySQL database for the Hive metastore. I have many Hive tables with data that I can query through the Hue Hive UI. I can also run queries from the command line, but intermittently I get

FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

about half the time. Whenever I pass an .hql file to hive as below, I get the above error every time.

hive -f test.hql 

I also see this error whenever I interact with Hive via an Oozie workflow. I originally had a local PostgreSQL metastore database that produced similar errors.

Below is my hive-site.xml. Any help getting rid of this error would be greatly appreciated.


<?xml version="1.0" encoding="UTF-8"?>

<!--Autogenerated by Cloudera CM on 2014-07-25T21:18:21.918Z-->
<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost:3306/metastore?useUnicode=true&amp;characterEncoding=UTF-8</value>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
</property>
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>notmypassword</value>
</property>
<property>
<name>hive.metastore.local</name>
<value>true</value>
</property>
<property>
<name>datanucleus.autoCreateSchema</name>
<value>false</value>
</property>
<property>
<name>datanucleus.metadata.validate</name>
<value>false</value>
</property>
<property>
<name>datanucleus.fixedDatastore</name>
<value>true</value>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/data/user/hive/warehouse</value>
</property>
<property>
<name>hive.warehouse.subdir.inherit.perms</name>
<value>true</value>
</property>
<property>
<name>mapred.reduce.tasks</name>
<value>-1</value>
</property>
<property>
<name>hive.exec.reducers.bytes.per.reducer</name>
<value>1073741824</value>
</property>
<property>
<name>hive.exec.reducers.max</name>
<value>999</value>
</property>
<property>
<name>hive.metastore.execute.setugi</name>
<value>true</value>
</property>
<property>
<name>hive.support.concurrency</name>
<value>true</value>
</property>
<property>
<name>hive.zookeeper.quorum</name>
<value>els-f14847</value>
</property>
<property>
<name>hive.zookeeper.client.port</name>
<value>2181</value>
</property>
<property>
<name>hive.zookeeper.namespace</name>
<value>hive_zookeeper_namespace_hive1</value>
</property>
<property>
<name>hive.metastore.server.min.threads</name>
<value>200</value>
</property>
<property>
<name>hive.metastore.server.max.threads</name>
<value>100000</value>
</property>
<property>
<name>datanucleus.autoStartMechanism</name>
<value>SchemaTable</value>
</property>
</configuration>

Solution

  • Don't you have more information about the DDLTask error? In Hue it should appear in the middle of the log tab.

    Usually DDL fails when you are missing some underlying HDFS permissions. Are you doing any ALTER/CREATE TABLE statements in your script? What is its content?