Tags: hadoop, hbase, backup, hortonworks-data-platform

HBase Incremental Backup failed on HDP


I created a "test" table in HBase to test the incremental backup feature on HDP.

    hbase(main):002:0> create 'test', 'cf'
    0 row(s) in 1.4690 seconds

    hbase(main):003:0> put 'test', 'row1', 'cf:a', 'value1'
    0 row(s) in 0.1480 seconds

    hbase(main):004:0> put 'test', 'row2', 'cf:b', 'value2'
    0 row(s) in 0.0070 seconds

    hbase(main):005:0> put 'test', 'row3', 'cf:c', 'value3'
    0 row(s) in 0.0120 seconds

    hbase(main):006:0> put 'test', 'row3', 'cf:c', 'value4'
    0 row(s) in 0.0070 seconds

    hbase(main):010:0> scan 'test'       
    ROW                   COLUMN+CELL                                               
    row1                 column=cf:a, timestamp=1317945279379, value=value1        
    row2                 column=cf:b, timestamp=1317945285731, value=value2        
    row3                 column=cf:c, timestamp=1317945301466, value=value4        
    3 row(s) in 0.0250 seconds

Then I took a full backup with the command below, and it succeeded:

    hbase backup create full hdfs://12.3.4.56:8020/tmp/full test -w 3
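As a sanity check, the backup tooling can also list prior sessions and their status; this is a sketch, and the backup ID passed to describe is illustrative (output format may vary by HDP version):

    # List earlier backup sessions with their IDs and final status
    hbase backup history

    # Show details for a single session (backup ID below is illustrative)
    hbase backup describe backup_1497431531851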

Now I want to test an "incremental" backup on the above "test" table, so I inserted a new row:

    put 'test', 'row123', 'cf:a', 'newValue'

But when I run the command below, it fails:

    hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full

Error:

    Backup session finished. Status: FAILURE
    2017-06-14 09:52:58,853 ERROR [main] util.AbstractHBaseTool: Error running command-line tool
    org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException):
            at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.cleanupTargetDir(FullTableBackupProcedure.java:205)
            at org.apache.hadoop.hbase.backup.master.FullTableBackupProcedure.failBackup(FullTableBackupProcedure.java:279)
            at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:164)
            at org.apache.hadoop.hbase.backup.master.IncrementalTableBackupProcedure.executeFromState(IncrementalTableBackupProcedure.java:54)
            at org.apache.hadoop.hbase.procedure2.StateMachineProcedure.execute(StateMachineProcedure.java:107)
            at org.apache.hadoop.hbase.procedure2.Procedure.doExecute(Procedure.java:443)
            at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execProcedure(ProcedureExecutor.java:934)
            at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:736)
            at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.execLoop(ProcedureExecutor.java:689)
            at org.apache.hadoop.hbase.procedure2.ProcedureExecutor.access$200(ProcedureExecutor.java:73)
            at org.apache.hadoop.hbase.procedure2.ProcedureExecutor$1.run(ProcedureExecutor.java:416)

Update:

The link below mentions that "Backups and restores should be run as the hbase superuser (which is called 'hbase' by default)." What does that mean? I am simply running the above backup commands as a regular user with root access. Please suggest.

https://hortonworks.com/blog/coming-hdp-2-5-incremental-backup-restore-apache-hbase-apache-phoenix/

I also tried changing the permissions on the HDFS backup directory (/tmp/full), but it didn't help.
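For reference, this is how the ownership of the backup destination can be inspected and, if needed, handed to the hbase user (a sketch; the /tmp/full path comes from the commands above, and hbase:hdfs is a typical HDP owner/group assumed here, so adjust to your cluster):

    # Inspect the current owner and permissions of the backup root
    hdfs dfs -ls /tmp/full

    # Hand the backup root to the hbase service user (run with HDFS superuser
    # rights; hbase:hdfs is a common HDP owner/group and may differ per cluster)
    hdfs dfs -chown -R hbase:hdfs /tmp/full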


Solution

  • I am using Kerberos, so after doing kinit as the principal running HBase, the incremental backup worked for me (a sketch of the flow is shown below).

    If you are not using Kerberos, first switch to the HBase user (e.g. 'su - hbase').
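
    A minimal sketch of that flow on a kerberized HDP node (the keytab path and principal name are typical HDP defaults, assumed here for illustration; adjust them to your cluster):

        # Switch to the hbase service user (requires root)
        su - hbase

        # Obtain a Kerberos ticket as the hbase principal (keytab path and
        # principal below are common HDP defaults, assumed for illustration)
        kinit -kt /etc/security/keytabs/hbase.headless.keytab hbase-mycluster@EXAMPLE.COM

        # Verify the ticket, then re-run the incremental backup
        klist
        hbase backup create incremental hdfs://12.3.4.56:8020/tmp/full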