Tags: cloudera, sqoop, oozie, hue, cloudera-quickstart-vm

Oozie-sqoop workflow hanging with Heart beat issue in cloudera


I'm trying to run a simple Sqoop import through Oozie from Hue (Cloudera VM). A few seconds after submitting, the job hangs forever, printing only "Heart beat" messages. I did some searching and found this thread: https://community.cloudera.com/t5/Batch-Processing-and-Workflow/Oozie-launcher-never-ends/td-p/13330. Not knowing which specific file to edit, I added the XML properties mentioned there to all of the yarn-site.xml files below, but it made no difference; I'm still facing the same issue. Can anyone give some insight on this?

/etc/hive/conf.cloudera.hive/yarn-site.xml
/etc/hadoop/conf.empty/yarn-site.xml
/etc/hadoop/conf.pseudo/yarn-site.xml
/etc/spark/conf.cloudera.spark_on_yarn/yarn-conf/yarn-site.xml
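
For reference, the properties usually suggested for this symptom constrain YARN container sizing so that the Oozie launcher job and the Sqoop MapReduce job can both get containers on a single-node VM. The property names below are standard YARN settings; the values are illustrative assumptions for a QuickStart VM, not taken from the linked thread:

```xml
<!-- yarn-site.xml fragment: illustrative sizing for a single-node VM -->
<property>
    <!-- smallest container YARN will hand out -->
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>512</value>
</property>
<property>
    <!-- total memory the NodeManager may allocate to containers -->
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>4096</value>
</property>
<property>
    <!-- total vcores the NodeManager may allocate to containers -->
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>4</value>
</property>
```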

job log

12480 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  - Beginning import of order_items
13225 [main] WARN  org.apache.sqoop.mapreduce.JobBase  - SQOOP_HOME is unset. May not be able to find all job dependencies.
16314 [main] INFO  org.apache.sqoop.mapreduce.db.DBInputFormat  - Using read commited transaction isolation
18408 [main] INFO  org.apache.hadoop.mapreduce.Job  - The url to track the job: http://quickstart.cloudera:8088/proxy/application_1484596399739_0002/
18409 [main] INFO  org.apache.hadoop.mapreduce.Job  - Running job: job_1484596399739_0002
25552 [main] INFO  org.apache.hadoop.mapreduce.Job  - Job job_1484596399739_0002 running in uber mode : false
25553 [main] INFO  org.apache.hadoop.mapreduce.Job  -  map 0% reduce 0%
Heart beat
Heart beat

workflow XML

<workflow-app name="Oozie_Test1" xmlns="uri:oozie:workflow:0.5">
    <start to="sqoop-e57e"/>
    <kill name="Kill">
        <message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="sqoop-e57e">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --m 1 --connect jdbc:mysql://quickstart.cloudera:3306/retail_db --username=retail_dba --password=cloudera --table order_items --hive-database sqoopimports --create-hive-table --hive-import --hive-table sqoop_hive_order_items</command>
            <file>/user/oozie/share/lib/mysql-connector-java-5.1.34-bin.jar#mysql-connector-java-5.1.34-bin.jar</file>
        </sqoop>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>
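
A related mitigation (my own suggestion, not from the question) is to cap the Oozie launcher's own resource requests inside the action, so the launcher does not occupy the only available container while it waits for the Sqoop job it spawned. `oozie.launcher.*` prefixed properties are Oozie's standard way to override the launcher's MapReduce settings; the values here are illustrative:

```xml
<!-- Add inside the <sqoop> element, before <command>; values are examples -->
<configuration>
    <property>
        <!-- shrink the launcher map task -->
        <name>oozie.launcher.mapreduce.map.memory.mb</name>
        <value>512</value>
    </property>
    <property>
        <!-- shrink the launcher's MR application master -->
        <name>oozie.launcher.yarn.app.mapreduce.am.resource.mb</name>
        <value>512</value>
    </property>
</configuration>
```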

Thanks Mx


Solution

  • This thread helped me resolve the Heart beat issue:

    https://community.cloudera.com/t5/Batch-Processing-and-Workflow/Oozie-sqoop-action-in-CDH-5-2-Heart-beat-issue/td-p/22181/page/2

    After getting past that error I got stuck on a "Failure to launch flume" issue, and this thread helped me fix it:

    oozie Sqoop action fails to import data to hive
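
A common fix discussed for Hive-import failures under Oozie (paraphrased here as an assumption, not verified against that thread) is to make Hive's configuration visible to the Sqoop action, e.g. by uploading the cluster's hive-site.xml to HDFS and shipping it with a `<file>` element alongside the JDBC driver already listed in the workflow:

```xml
<!-- hypothetical HDFS path: upload your cluster's hive-site.xml there first -->
<file>/user/oozie/share/lib/hive-site.xml#hive-site.xml</file>
```

Without hive-site.xml on the action's classpath, the Sqoop launcher cannot locate the Hive metastore, so the plain import succeeds but the Hive load step fails.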