Answer Posted / Dhermendra Kumar
In the context of Apache Spark, a worker node is a machine in the cluster that runs one or more executor processes. Executors carry out the tasks scheduled by the driver program and report their results and status back to it. Worker nodes do the actual data processing and can be managed either by Spark's own standalone cluster manager or by an external resource manager such as Apache Mesos or Hadoop YARN.
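As a rough sketch of how this looks in practice with Spark's standalone cluster manager, the distribution ships scripts for starting the master and registering workers, and `spark-submit` flags control how big the executors on those workers are. The hostname, port, and application file below are placeholders, not values from the answer above:

```shell
# Start the standalone cluster manager (master) on one machine.
./sbin/start-master.sh

# On each worker machine, start a worker and register it with the
# master. spark://master-host:7077 is a placeholder master URL.
./sbin/start-worker.sh spark://master-host:7077

# Submit an application. The driver requests executors on the
# workers, sized by these flags (standalone-mode options shown).
./bin/spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 2g \
  --total-executor-cores 4 \
  my_app.py   # placeholder application file
```

Note that in YARN deployments the resource requests go through YARN instead (for example via `--num-executors`), but the worker/executor roles are the same.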