I'm using Spark job-server for job management. Suppose I need to create 10 jobs. I can create 10 separate jars and call them like this:
curl -d "" 'job-server-host:8090/jobs?appName=my_job_number_1&classPath=com.spark.jobs.MainClass'
curl -d "" 'job-server-host:8090/jobs?appName=my_job_number_2&classPath=com.spark.jobs.MainClass'
...
Or I can create a single jar containing 10 job classes:
curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job1'
curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job2'
...
Which approach is preferable, and why?
My main motivation for using spark-job-server is Spark job management and context management.
It all depends on your requirements. If the jobs are related and can be grouped, put them all in a single jar (organizing related jobs into different packages if needed) rather than creating separate jars, and run them against the same app and context.
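For the single-jar approach, each job is simply a separate class in the jar. A minimal sketch of two jobs sharing one jar, assuming the classic spark.jobserver.SparkJob API (package names and method signatures vary across spark-jobserver versions, and the workloads inside runJob are hypothetical):

package com.spark.jobs

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

// First job class in the shared jar
object Job1 extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  override def runJob(sc: SparkContext, config: Config): Any =
    sc.parallelize(1 to 100).sum()  // hypothetical workload
}

// Second job class, same jar, same app
object Job2 extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid
  override def runJob(sc: SparkContext, config: Config): Any =
    sc.parallelize(1 to 100).map(_ * 2).count()  // hypothetical workload
}

To actually share one SparkContext across jobs, create a long-running context first and pass it in the job request (context name and resource values here are illustrative):

curl -d "" 'job-server-host:8090/contexts/shared-ctx?num-cpu-cores=2&memory-per-node=512m'
curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job1&context=shared-ctx'
curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job2&context=shared-ctx'

Without the context parameter, each POST to /jobs spins up a temporary context, so the single-jar layout plus a named context is what gives you the context reuse you're after.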