Tags: hadoop, oozie

Running Oozie in local mode gives an error


I am trying to run an Oozie job using the XML below. However, the action fails with the error:

Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]

On analyzing the logs, I observed that the error was caused by java.lang.ClassNotFoundException: Mainclass. However, Mainclass exists in the jar at the HDFS location, and the jar is specified in the XML below. Here is my code:

<action name="action1" cred="hive_credentials">
                <spark xmlns="uri:oozie:spark-action:0.2">
                        <job-tracker>${jobTracker}</job-tracker>
                        <name-node>${nameNode}</name-node>
                        <master>local[*]</master>
                        <name>name</name>
                        <class>Mainclass</class>
                        <jar>${jar1}</jar>
                        <spark-opts>
                                --files hive-site.xml --conf spark.yarn.security.tokens.hive.enabled=false
                        </spark-opts>
                        <arg>arg1</arg>
                        <file>${nameNode}/test/${wf:user()}/hive-site.xml</file>
                </spark>
                <ok to="end" />
                <error to="kill_job" />
        </action>

What could be the issue?


Solution

  • I resolved the issue by:

    1) Creating a "lib" folder directly next to the workflow XML in HDFS

    2) Copying the jar containing Mainclass into the lib folder

    3) Specifying only the jar name in the <jar> tag, not the fully qualified HDFS path
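The steps above can be sketched as follows. Oozie automatically adds jars found in a `lib` directory next to the workflow definition to the action's classpath, so the `<jar>` element only needs the bare file name. The application path and jar name here (`myapp.jar`) are illustrative, not taken from the original question:

```xml
<!-- Assumed HDFS layout (paths are hypothetical):
     /user/me/myapp/workflow.xml
     /user/me/myapp/lib/myapp.jar   <- jar containing Mainclass -->
<action name="action1" cred="hive_credentials">
        <spark xmlns="uri:oozie:spark-action:0.2">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <master>local[*]</master>
                <name>name</name>
                <class>Mainclass</class>
                <!-- bare jar name only; resolved from the workflow's lib/ folder -->
                <jar>myapp.jar</jar>
                <spark-opts>
                        --files hive-site.xml --conf spark.yarn.security.tokens.hive.enabled=false
                </spark-opts>
                <arg>arg1</arg>
                <file>${nameNode}/test/${wf:user()}/hive-site.xml</file>
        </spark>
        <ok to="end" />
        <error to="kill_job" />
</action>
```

With this layout, `oozie.wf.application.path` points at the directory containing `workflow.xml`, and the `ClassNotFoundException` no longer occurs because the jar is shipped with the workflow itself.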