Tags: apache-spark, mesos

Is it possible to run multiple Spark applications on a mesos cluster?


I have a Mesos cluster with 1 master and 3 slaves (2 cores and 4 GB RAM each) that already has a Spark application up and running. I wanted to run another application on the same cluster, since CPU and memory utilization isn't high. However, when I try to run the new application, I get the error:

16/02/25 13:40:18 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

I guess the new application is not getting any CPUs, as the old one occupies all 6. I have tried enabling dynamic allocation, switching the Spark app to fine-grained mode, and assigning numerous combinations of executor cores and number of executors. What am I missing here? Is it possible to run a Mesos cluster with multiple Spark frameworks at all?


Solution

  • You can try setting spark.cores.max to cap the total number of CPUs each Spark application requests across the cluster, which will leave resources free for other frameworks (see the sketch after the docs link below).

    Docs: https://spark.apache.org/docs/latest/configuration.html#scheduling
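
    For example, a minimal sketch of capping an application's cores via SparkConf (the master URL, app name, and the value 3 are placeholders; pick a limit that leaves cores free for the second framework):

        import org.apache.spark.{SparkConf, SparkContext}

        // Cap this application at 3 of the cluster's 6 cores so Mesos can
        // still make resource offers to a second Spark framework.
        val conf = new SparkConf()
          .setAppName("first-app")                 // placeholder app name
          .setMaster("mesos://<master-host>:5050") // placeholder Mesos master URL
          .set("spark.cores.max", "3")

        val sc = new SparkContext(conf)

    The same limit can also be passed at submit time with spark-submit --conf spark.cores.max=3, so that both applications together stay within the cluster's 6 cores.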