Tags: apache-spark, cluster-computing, master-slave, apache-spark-1.5

Can I have a master and a worker on the same node?


I have a 3-node Spark standalone cluster, and on the master node I also run a worker. When I submit an app to the cluster, the two other workers start RUNNING, but the worker on the master node stays in the LOADING state, and eventually another worker is launched on one of the other machines.

Is having a worker and a master on the same node the problem? If so, is there a way to work around it, or should I never have a worker and a master on the same node?

P.S. Each machine has 8 cores, and the workers are configured to use 7 of them and less than the total RAM, as sketched below.
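
For context, in a standalone cluster these per-worker limits are usually set in conf/spark-env.sh on each node; a minimal sketch matching the setup above (the exact memory figure is an assumption, not something stated in the post) would be:

    # conf/spark-env.sh on each worker node
    # Use 7 of the machine's 8 cores for this worker
    export SPARK_WORKER_CORES=7
    # Leave headroom for the OS (and, on the shared node, the master daemon);
    # 6g is a placeholder value, not taken from the original post
    export SPARK_WORKER_MEMORY=6g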


Solution

  • It is possible to have a machine hosting both Workers and a Master.

    Is it possible that you misconfigured spark-env.sh on that specific machine?
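
    If the worker on the master node never leaves LOADING, one common culprit to rule out (a guess at the usual cause, not something the post confirms) is inconsistent host/IP settings in conf/spark-env.sh on that machine, which can prevent the worker from registering with the master it shares the box with. A sketch for Spark 1.5's standalone scripts, where 192.168.1.10 stands in for your master node's address:

        # conf/spark-env.sh on the node hosting both daemons
        # Bind both daemons to the same, externally reachable address
        export SPARK_MASTER_IP=192.168.1.10
        export SPARK_LOCAL_IP=192.168.1.10

        # Start the master, then a worker pointing at it:
        ./sbin/start-master.sh
        ./sbin/start-slave.sh spark://192.168.1.10:7077

    If both daemons resolve the same address and the worker still stalls, the worker's log (under SPARK_HOME/logs on that node) should show why registration fails.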