Tags: apache-spark, mesos

Running multiple Spark jobs on a Mesos cluster


I would like to run multiple Spark jobs on my Mesos cluster and have them all share the same Spark framework. Is this possible? I have tried running the MesosClusterDispatcher and having the Spark jobs connect to the dispatcher, but each Spark job launches its own "Spark Framework" (I have tried both client mode and cluster mode). Is this the expected behaviour? Is it possible to share the same Spark framework among multiple Spark jobs?


Solution

  • Yes, this is normal and it is the expected behaviour.

    In Mesos, as far as I know, the MesosClusterDispatcher is in charge of allocating resources for your Spark driver, which will itself act as a framework. Once the Spark driver has been allocated, it is responsible for talking to Mesos and accepting offers to allocate the executors where its tasks will be executed. So every driver (one per spark-submit) registers as its own framework, and there is no framework shared across separate submissions (see the sketch below).
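
    For illustration, here is a minimal sketch of that one-framework-per-driver model. The master URL, app name, and toy jobs are placeholders, not details from the question: any jobs triggered through the same driver run under one framework, while each additional spark-submit starts another driver and therefore another framework.

    ```scala
    import org.apache.spark.sql.SparkSession

    object FrameworkPerDriver {
      def main(args: Array[String]): Unit = {
        // One driver == one Mesos framework. Each spark-submit starts a new
        // driver, so each submission registers its own framework with Mesos.
        val spark = SparkSession.builder()
          .appName("job-a")                      // shows up as the framework name in the Mesos UI
          .master("mesos://mesos-master:5050")   // placeholder Mesos master URL
          .getOrCreate()

        // Both of these jobs go through the same driver/SparkContext, so they
        // share a single framework and its executors.
        val doubled = spark.sparkContext.parallelize(1 to 1000).map(_ * 2).count()
        val squared = spark.sparkContext.parallelize(1 to 1000).map(x => x * x).count()
        println(s"doubled = $doubled, squared = $squared")

        spark.stop()  // the framework is unregistered when the driver exits
      }
    }
    ```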