Tags: scala, apache-spark, dataframe, rdd

Spark Tasks reading the data but not writing back


We are using Spark 1.6, and while running jobs in the spark-shell we observed that tasks are reading data but not writing it back to complete the tasks, as shown in the table below:

Address | Task Time | Total Tasks | Failed Tasks | Succeeded Tasks | Shuffle Read | Shuffle Write
1       | 0         | 0           | 0            | 0               | 188 KB / 707 | 0.0 B / 670

The Spark program uses 5 executors, each with 5 GB of memory and 3 cores. Please suggest what could be going wrong here.
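For context, a minimal diagnostic sketch one might run in the spark-shell to check how much parallelism is actually available (the input path is a placeholder, not the poster's data):

    // Hypothetical spark-shell check; "/data/input" is a placeholder path.
    // With 5 executors x 3 cores there are 15 task slots, so if the data has
    // fewer partitions than that, executors sit idle and a stage can appear stuck.
    val rdd = sc.textFile("/data/input")
    println(s"defaultParallelism = ${sc.defaultParallelism}")
    println(s"input partitions   = ${rdd.partitions.length}")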


Solution

  • I solved this issue by increasing the number of tasks (i.e. the number of partitions) in the cluster settings.
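As a sketch of what that change can look like (the partition counts and path below are illustrative, not the poster's exact settings), the number of tasks can be raised either when reading the data, by repartitioning, or through the shuffle-related configuration keys:

    // Illustrative spark-shell sketch: aim for at least as many partitions as
    // task slots (5 executors x 3 cores = 15 slots in the setup described above).

    // Ask for more partitions when reading the input:
    val data = sc.textFile("/data/input", 30)
    println(s"partitions after read: ${data.partitions.length}")

    // Or reshuffle an existing RDD to more partitions:
    val spread = data.repartition(30)

    // Equivalent cluster/session settings (set before the job runs):
    //   spark.default.parallelism    = 30   // tasks for RDD shuffles
    //   spark.sql.shuffle.partitions = 30   // tasks for DataFrame/SQL shuffles

With enough partitions, every executor core gets work during the shuffle stage, so the shuffle-write side of the table above should no longer stay at zero.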