In which deployment mode can we not add nodes/workers to a cluster in Apache Spark 2.3.1?
1. Spark Standalone
2. Mesos
3. Kubernetes
4. YARN
5. Local Mode
I have installed Apache Spark 2.3.1 on my machine and have run it in local mode.
In local mode, can we add nodes/workers to Apache Spark?
When the master is local, your program runs on a single machine (your edge node), and you cannot add workers to it. To run in a distributed environment, i.e. on a cluster, you need to point the master at a cluster manager such as YARN, Spark Standalone, Mesos, or Kubernetes.
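A minimal Scala sketch of how the master URL is chosen when building a `SparkSession`; the hostnames, ports, and app name below are placeholders, not values from the question:

```scala
import org.apache.spark.sql.SparkSession

// The master URL decides whether Spark runs on one machine or hands work
// to a cluster manager. "local[*]" keeps everything on the edge node,
// so no extra workers can join; the commented URLs target real clusters.
object MasterUrlExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("master-url-demo")
      .master("local[*]")                       // single machine, all local cores
      // .master("yarn")                        // YARN cluster manager
      // .master("spark://master-host:7077")    // Spark Standalone (placeholder host)
      // .master("mesos://mesos-host:5050")     // Mesos (placeholder host)
      // .master("k8s://https://k8s-api:6443")  // Kubernetes API server (placeholder URL)
      .getOrCreate()

    println(s"Running with master = ${spark.sparkContext.master}")
    spark.stop()
  }
}
```

In practice the master is usually left out of the code and supplied at launch time with `spark-submit --master ...`, so the same jar can run locally or on a cluster.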
Note that the deploy mode controls where the driver runs, not who the master is. When the deploy mode is "client" (the default), the driver program runs on your edge node. When the deploy mode is "cluster", the driver runs on one of the healthy nodes in the cluster, chosen by the cluster manager.
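As a rough illustration, the deploy mode is normally passed to `spark-submit` via `--deploy-mode`, and the resulting value is visible to the application through the `spark.submit.deployMode` config property. A small sketch (object and app names are placeholders):

```scala
import org.apache.spark.sql.SparkSession

// Prints which deploy mode the application was launched with:
// "client" means the driver is on the machine that ran spark-submit,
// "cluster" means the cluster manager placed the driver on a cluster node.
object DeployModeCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("deploy-mode-demo").getOrCreate()
    val deployMode = spark.sparkContext.getConf
      .get("spark.submit.deployMode", "client") // falls back to the default
    println(s"deploy mode = $deployMode")
    spark.stop()
  }
}
```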