Answer Posted / Atul Chandra
A Worker Node in Apache Spark is a machine in the cluster that runs application code. Each worker node hosts one or more executors, the processes that perform the actual computation by running tasks. When an application is submitted, Spark starts a driver program, which schedules tasks and manages execution, and launches executors on the worker nodes, where those tasks are executed.
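As a minimal sketch of how this plays out in practice, the spark-submit command below requests executor resources that the cluster manager then allocates on worker nodes. The flags are standard spark-submit options for Spark standalone mode; the master URL and the application script name are placeholders, not values from the original answer.

```shell
# Submit an application to a standalone Spark cluster.
# The driver coordinates execution; executors run on worker nodes.
# master-host:7077 and my_app.py are hypothetical placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 2g \
  --executor-cores 2 \
  --total-executor-cores 8 \
  my_app.py
```

With these settings the cluster manager would place up to four executors (8 total cores / 2 cores each) across the available worker nodes, each with 2 GB of memory.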