I am trying to write a simple map-only Hadoop streaming job that reads data from HDFS and pushes it to Vertica.
The mapper is a shell script, export.sh, built around this command:
./vsql -c "copy $TABLE from stdin delimiter E'\t' direct null '\\N';" -U $DBUSER -w $DBPWD -h $DBHOST -p $DBPORT
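For context, export.sh is essentially just that one command plus its connection settings; this is a minimal sketch, and the values below are placeholders for my actual configuration:

#!/bin/sh
# Connection settings (placeholder values).
TABLE=my_schema.my_table
DBUSER=dbadmin
DBPWD=secret
DBHOST=vertica.example.com
DBPORT=5433

# Hadoop streaming pipes the input records to this script's stdin;
# vsql's COPY ... FROM STDIN reads that same stream and loads it.
./vsql -c "copy $TABLE from stdin delimiter E'\t' direct null '\\N';" \
    -U $DBUSER -w $DBPWD -h $DBHOST -p $DBPORT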
I have created the Oozie workflow action as follows:
<action name="loadToVertica">
<map-reduce>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<prepare>
<delete path="${nameNode}/user/$USER/output/${exportDataDate}"/>
</prepare>
<streaming>
<mapper>shell export.sh</mapper>
</streaming>
<configuration>
<property>
<name>oozie.libpath</name>
<value>${wfsBasePath}/libs</value>
</property>
<property>
<name>mapred.input.dir</name>
<value>${nameNode}/user/$USER$/{exportDataDate}</value>
</property>
<property>
<name>mapred.output.dir</name>
<value>${nameNode}/user/$USER/output/${exportDataDate}</value>
</property>
<property>
<name>mapred.reduce.tasks</name>
<value>0</value>
</property>
</configuration>
<file>${wfsBasePath}/libs/${STREAMING_JAR_PATH}#${STREAMING_JAR_PATH}</file>
<file>${wfsBasePath}/libs/oozie-sharelib-streaming-4.2.0.2.5.3.0-37.jar#oozie-sharelib-streaming-4.2.0.2.5.3.0-37.jar</file>
<file>${wfsBasePath}/scripts/export.sh#export.sh</file>
<file>${wfsBasePath}/config/vsql#vsql</file>
</map-reduce>
<ok to="end"/>
<error to="end"/>
</action>
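I submit the workflow and dig into the task logs like this (the Oozie URL, properties file, and application ID are placeholders for my environment):

oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run
oozie job -oozie http://oozie-host:11000/oozie -info 0000001-170101000000000-oozie-oozi-W
# Pull the streaming task's stdout/stderr once the job has failed:
yarn logs -applicationId application_1483228800000_0001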
When I run this, the job's status is Failed/Killed without any error message.
Adding -e after #!/bin/sh helped me trace the actual error: with the -e option, the script aborts on the first failing command and an error code shows up in the task logs. With this change the first line of the script looks like:
#!/bin/sh -e
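With -e in place, any failing command (for example a vsql connection or COPY error) aborts the script with a non-zero exit code, which the streaming framework records in the task logs instead of silently swallowing. The revised script then reads:

#!/bin/sh -e
# -e makes the script exit immediately when any command fails, so the
# mapper returns a non-zero code and the error surfaces in the logs.
./vsql -c "copy $TABLE from stdin delimiter E'\t' direct null '\\N';" \
    -U $DBUSER -w $DBPWD -h $DBHOST -p $DBPORT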