Tags: apache-spark, ibm-cloud

spark-submit.sh returns 0 if job fails


I'm trying to automate running the spark-submit.sh script from another application. However, I noticed that even though spark-submit is aware of an error (it reports ERROR: Job failed.), it returns an exit code of zero, which indicates that the spark-submit.sh script completed successfully. See:

./spark-submit.sh --vcap .vcap.json --deploy-mode cluster --master ...
To see the log, in another terminal window run the following command:
tail -f spark-submit_1463130693N.log

...

ERROR: Job failed.
Log file: spark-submit_1463130693N.log
snowch$ echo $?
0

I could parse the output from the spark-submit.sh script for the ERROR message, but that isn't very robust. Am I using the correct version?

snowch$ ./spark-submit.sh --version
spark-submit.sh  VERSION : '1.0.0.0.20160420.1'

Is this a bug with the spark-submit.sh script?


Solution

  • You're using the right version. It is a bug: spark-submit.sh should return a non-zero exit code in that case.
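
Until that is fixed, one possible workaround is a wrapper that derives an exit code from the script's output. The following is a minimal sketch, assuming the script always prints the line "ERROR: Job failed." on failure, as in the output above; the wrapper itself is hypothetical and not part of the spark-submit.sh distribution:

#!/usr/bin/env bash
# Workaround sketch: spark-submit.sh exits 0 even when the job fails,
# so derive an exit code from its output instead.
# Assumes failures are reported with the line "ERROR: Job failed."
set -o pipefail

# Run spark-submit.sh with the arguments given to this wrapper,
# streaming its output to the console while also capturing it.
output=$(./spark-submit.sh "$@" 2>&1 | tee /dev/stderr)

# Fail the wrapper if the captured output contains the failure message.
if printf '%s\n' "$output" | grep -q 'ERROR: Job failed\.'; then
  echo "Job failed (detected from spark-submit.sh output)." >&2
  exit 1
fi

exit 0

Invoke the wrapper with the same arguments you would normally pass to spark-submit.sh, then check $? as usual from the calling application.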