When submitting a Spark Streaming program with spark-submit (YARN mode), it keeps polling the status and never exits
Is there any option in spark-submit to exit after the submission?
===Why this troubles me===
The streaming program will run forever and I don't need the status updates.
I can Ctrl+C to stop it if I start it manually, but I have lots of streaming contexts to start and I need to start them using a script.
I can put the spark-submit process in the background, but after lots of background Java processes are created, that user will no longer be able to run any other Java process because the JVM cannot create GC threads.
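Roughly, the kind of launcher script I mean looks like the sketch below (the jar path, main class, and topic names are just placeholders); every backgrounded spark-submit keeps a JVM alive on the submitting machine while it polls YARN:

    #!/bin/bash
    # Hypothetical launcher: one streaming job per input, each spark-submit
    # put in the background with '&'. Every backgrounded submitter keeps
    # polling YARN, so the submitting user accumulates one idle JVM per job.
    for topic in clicks impressions purchases; do
        spark-submit \
            --master yarn \
            --deploy-mode cluster \
            --class com.example.StreamingJob \
            /path/to/streaming-app.jar "$topic" &
    done
    wait   # never returns while the submitters keep polling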
Interesting. I never thought about this issue. I'm not sure there is a clean way to do this, but I simply kill the submit process on the machine and the YARN job continues to run until you stop it explicitly. So you can create a script that executes spark-submit and then kills the spark-submit process. When you actually want to stop the job, use yarn application -kill. Dirty, but it works.
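As a rough sketch of that workaround (the jar path and class name are placeholders, and the 30-second sleep is just a guess at how long the hand-off to YARN takes, not anything spark-submit guarantees):

    #!/bin/bash
    # Submit-then-kill workaround: background spark-submit, wait for the
    # application to be handed over to YARN, then kill the local submitter.
    # The YARN application itself keeps running.
    spark-submit \
        --master yarn \
        --deploy-mode cluster \
        --class com.example.StreamingJob \
        /path/to/streaming-app.jar &
    SUBMIT_PID=$!

    sleep 30              # assumed to be long enough for YARN to accept the app
    kill "$SUBMIT_PID"    # the streaming job on YARN is unaffected

You can check with yarn application -list that the job is still RUNNING afterwards, and stop it later with yarn application -kill <application id>.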