Tags: hadoop, hive, thrift

Hive error when creating an external table (state=08S01,code=1)


I'm trying to create an external table in Hive, but keep getting the following error:

create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";
Error: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask (state=08S01,code=1)
Aborting command set because "force" is false and command failed: "create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/hive_test_1375711405.45852.txt";"

The contents of /tmp/hive_test_1375711405.45852.txt are:

abc\tdef
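
That \t is a literal tab character in the file. As a minimal sketch (assuming the data also needs to sit in HDFS at the same path), the test file can be recreated and uploaded with:

# Write a single row with a real tab between the two columns, then copy it into HDFS
printf 'abc\tdef\n' > /tmp/hive_test_1375711405.45852.txt
hdfs dfs -put /tmp/hive_test_1375711405.45852.txt /tmp/hive_test_1375711405.45852.txt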

I'm connecting via the beeline command line interface, which uses Thrift HiveServer2.

System:

  • Hadoop 2.0.0-cdh4.3.0
  • Hive 0.10.0-cdh4.3.0
  • Beeline 0.10.0-cdh4.3.0
  • Client OS - Red Hat Enterprise Linux Server release 6.4 (Santiago)

Solution

  • The issue was that I was pointing the external table at a file in HDFS instead of a directory. The cryptic Hive error message really threw me off.

    The solution is to point the table at a directory and put the data file inside it. To fix the example above, create a directory /tmp/foobar in HDFS and move hive_test_1375711405.45852.txt into it (a rough sketch of the HDFS commands is below the statement), then create the table like so:

    create external table foobar (a STRING, b STRING) row format delimited fields terminated by "\t" stored as textfile location "/tmp/foobar";
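
    As a rough sketch, the directory setup can be done with the HDFS shell (paths are the ones from this example; hdfs dfs and hadoop fs behave the same for these commands):

    # Create the table directory and move the existing data file into it
    hdfs dfs -mkdir /tmp/foobar
    hdfs dfs -mv /tmp/hive_test_1375711405.45852.txt /tmp/foobar/

    After that, a quick select * from foobar; in beeline should return the row from the file.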