What are the roles and responsibilities of worker nodes in an Apache Spark cluster? Is a worker node in Spark the same as a slave node?
Answer Posted / Furakan Ali
Worker nodes in an Apache Spark cluster host the executor processes that run the tasks scheduled by the driver. Executors process data partitions, exchange data between nodes during shuffles, and cache RDD or DataFrame partitions in memory, spilling to local disk when memory is insufficient. As for terminology: "slave node" is the legacy name for the same role. Spark's standalone cluster manager historically used "slave" in its scripts and configuration (e.g. start-slave.sh), and these were renamed to "worker" (e.g. start-worker.sh) in Spark 3.1. So yes, a worker node and a slave node refer to the same thing, with "worker" being the current term.
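To make the driver/worker split concrete, here is a sketch of a spark-submit invocation for a standalone cluster. The master URL, memory and core values, and the application file name are illustrative assumptions, not values from the question; the flags themselves (--executor-memory, --executor-cores, --total-executor-cores) are standard spark-submit options.

```shell
# Illustrative only: master host, resource sizes, and my_app.py are assumptions.
# The driver schedules tasks onto executors that the cluster manager
# launches on each worker node.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --executor-memory 4g \
  --executor-cores 2 \
  --total-executor-cores 8 \
  my_app.py
```

With these settings the standalone master would launch executors across the available workers, each executor getting 2 cores and 4 GB of memory, up to a cluster-wide cap of 8 cores.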