I have been using Spark + Python to get some work done, and it's great, but there is a question on my mind:
Where is the work for Spark transformations and actions actually done?
Are transformations done on the Spark Master (or Driver) while actions are done on the Workers (Executors), or are both done on the Workers (Executors)?
Workers (aka slaves) are the machines that run Spark instances, where the executors live to execute tasks.

Transformations are performed on the workers. They are lazy: nothing is computed until an action is called, at which point the computation runs on the executors and the result is brought back to the driver.
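To make that concrete, here is a minimal PySpark sketch (assuming a local Spark installation; the app name is arbitrary). The transformations only record lineage; the `collect()` action is what actually runs tasks on the executors and ships the result to the driver:

```python
from pyspark import SparkContext

# Minimal local setup; assumes Spark is installed locally.
sc = SparkContext("local[2]", "where-work-runs")

rdd = sc.parallelize(range(10))

# Transformations are lazy: nothing executes on the workers yet,
# Spark only records them in the RDD lineage.
squares = rdd.map(lambda x: x * x)
evens = squares.filter(lambda x: x % 2 == 0)

# The action triggers execution: tasks run on the executors,
# and only the final result is shipped back to the driver.
result = evens.collect()
print(result)  # [0, 4, 16, 36, 64]

sc.stop()
```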
An application in Spark is executed in three steps:

1. Create the RDD graph, i.e. a DAG (directed acyclic graph) of RDDs to represent the entire computation.
2. Create the stage graph, i.e. a DAG of stages that is a logical execution plan based on the RDD graph. Stages are created by breaking the RDD graph at shuffle boundaries (see the sketch after this list).
3. Based on the plan, schedule and execute tasks on the workers.
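You can see the stage boundary from step 2 yourself. In this sketch (again assuming a local Spark installation), `reduceByKey` forces a shuffle, and `toDebugString()` prints the RDD lineage with indentation marking where the graph is broken into separate stages:

```python
from pyspark import SparkContext

sc = SparkContext("local[2]", "stage-boundaries")

pairs = sc.parallelize([("a", 1), ("b", 2), ("a", 3)])

# reduceByKey requires a shuffle, so the RDD graph is split
# into two stages at this boundary.
counts = pairs.reduceByKey(lambda a, b: a + b)

# The lineage shows a ShuffledRDD; the indentation marks the
# stage boundary that the DAG scheduler will split on.
debug = counts.toDebugString()
# toDebugString() returns bytes in recent PySpark versions.
print(debug.decode("utf-8") if isinstance(debug, bytes) else debug)

sc.stop()
```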