
Mesos Configuration with existing Apache Spark standalone cluster


I am a beginner with Apache Spark. I have set up a Spark standalone cluster using 4 PCs. I want to use Mesos with this existing Spark standalone cluster, but I have read that I need to install Mesos first and then configure Spark.

I have also seen the Spark documentation on running with Mesos, but it was not helpful to me.

So how do I configure Mesos with an existing Spark standalone cluster?


Solution

  • Mesos is an alternative cluster manager to the standalone Spark manager. You don't use it with the standalone manager; you use it instead of it.

    • To create a Mesos cluster, follow https://mesos.apache.org/gettingstarted/
    • Make sure the Mesos native library is available on the machine you use to submit jobs.
    • For cluster mode, start the Mesos dispatcher (sbin/start-mesos-dispatcher.sh).
    • Submit applications using the Mesos master URI (client mode) or the dispatcher URI (cluster mode).
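The steps above could look roughly like this in practice. This is a sketch, not a tested recipe: the hostnames and jar path are placeholders, the library path assumes a default Mesos installation, and 5050 (Mesos master) and 7077 (MesosClusterDispatcher) are the default ports.

```
# Make the Mesos native library visible to Spark on the submitting machine
# (path is an assumption; adjust to where Mesos was installed)
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

# Client mode: submit directly against the Mesos master (default port 5050)
./bin/spark-submit \
  --master mesos://mesos-master-host:5050 \
  --class com.example.MyApp \
  /path/to/my-app.jar

# Cluster mode: start the dispatcher once, then submit against it
# (default dispatcher port 7077)
./sbin/start-mesos-dispatcher.sh --master mesos://mesos-master-host:5050
./bin/spark-submit \
  --master mesos://dispatcher-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  http://some-host/path/to/my-app.jar
```

Note that in cluster mode the jar must be at a location the cluster nodes can reach (e.g. an http:// or hdfs:// URL), since the driver runs inside the cluster rather than on your machine.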