
Unable to insert data into partitioned table due to precision loss


I have created an external table partitioned on two columns, 'country' and 'state', stored as SEQUENCEFILE.

I am now trying to load data into the table using the following command in Impala, run via the Hue editor:

load data inpath '/usr/temp/input.txt' 
into table partitioned_user
partition (country = 'US', state = 'CA');

I am getting the following error:

AnalysisException: Partition key value may result in loss of precision. Would need to cast ''US'' to 'VARCHAR(64)' for partition column: country

What am I doing wrong? The table I am inserting into has the columns first_name, last_name, country, and state, all of type VARCHAR(64).

The file input.txt contains data only for the first two columns.


Solution

  • Impala does not automatically convert from a larger type to a smaller one. A quoted literal such as 'US' is typed as STRING, so you must CAST() each partition key value to VARCHAR(64) to avoid this exception in Impala:

    partition (country = cast('US' as VARCHAR(64)), state = cast('CA' as VARCHAR(64)))
    

    Or use the STRING data type for the partition columns in the table DDL instead, since STRING partition keys accept quoted literals without a cast.
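
    Putting it together, the full LOAD DATA statement with the casts would look like the sketch below (table name, file path, and VARCHAR length taken from the question; verify the length matches your actual DDL):

    ```sql
    -- Cast each partition key literal to the partition column's declared type,
    -- so the STRING literal is not implicitly narrowed to VARCHAR(64)
    LOAD DATA INPATH '/usr/temp/input.txt'
    INTO TABLE partitioned_user
    PARTITION (country = CAST('US' AS VARCHAR(64)),
               state   = CAST('CA' AS VARCHAR(64)));
    ```

    The same CAST() is needed anywhere a partition key literal is supplied for a VARCHAR partition column, e.g. in INSERT ... PARTITION statements.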