Tags: hadoop, bigdata, hortonworks-data-platform, hcatalog

HCatClient error when adding tables in Hive


I am running the Hortonworks 2.2 sandbox on a Windows 7 host machine, with 2 GB of memory allocated to the virtual machine. Every time I try to create a table, I get the following error:

HCatClient error on create table: {"statement":"use default; create table kjdrg(a bigint, b bigint, c bigint) comment 'k' row format delimited fields terminated by ',';","error":"unable to create table: kjdrg","exec":{"stdout":"","stderr":"15/07/02 12:55:45 WARN conf.HiveConf: HiveConf of name hive.optimize.mapjoin.mapreduce does not exist\n15/07/02 12:55:45 WARN conf.HiveConf: HiveConf of name hive.heapsize does not exist\n15/07/02 12:55:45 WARN conf.HiveConf: HiveConf of name hive.server2.enable.impersonation does not exist\n15/07/02 12:55:45 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.noconditionaltask does not exist\nSLF4J: Class path contains multiple SLF4J bindings.\nSLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hive/lib/hive-jdbc-0.14.0.2.2.0.0-2041-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.\nSLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]\n Command was terminated due to timeout(60000ms). See templeton.exec.timeout property","exitcode":143}} (error 500)

How can I address this?


Solution

  • Posting the solution that finally worked for me. The problem was that creating tables through the browser-based HCat interface used up too much RAM, so the request hit the 60-second limit shown at the end of the stderr output ("Command was terminated due to timeout(60000ms)"). The workaround is to bypass the web UI entirely: use PuTTY to SSH into the sandbox and create the tables from the Hive CLI instead, which works smoothly. No code was involved in solving the problem.
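
    As a sketch of the workaround, the steps look roughly like this (the host, port, and credentials below are the usual HDP 2.2 sandbox defaults and are assumptions, not taken from the question; adjust them for your VM). The DDL is the same statement the web UI timed out on:

    ```shell
    # SSH into the sandbox (HDP sandbox defaults assumed: NAT port 2222 on localhost)
    ssh root@127.0.0.1 -p 2222

    # Once inside the VM, run the DDL directly from the Hive CLI
    # instead of going through the WebHCat/HCatalog browser UI:
    hive -e "use default; create table kjdrg(a bigint, b bigint, c bigint) comment 'k' row format delimited fields terminated by ',';"

    # Verify the table exists
    hive -e "show tables like 'kjdrg';"
    ```

    Alternatively, since the stderr message points at the templeton.exec.timeout property (the 60000 ms limit), raising that value in the WebHCat configuration may also avoid the error, though the CLI route sidesteps the memory pressure entirely.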
