Answer Posted / Ankur Shukla
In Apache Spark, an executor is a JVM process launched on a worker node that runs the tasks assigned to it by the driver. Each executor can run multiple tasks concurrently (one per core allocated to it), caches data for the application, and manages its own JVM heap space.
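As a sketch of how executor resources are typically controlled, the standard `spark-submit` flags below set the number of executors, the cores per executor (which bounds concurrent tasks per executor), and each executor's JVM heap. The application JAR name and class are placeholders, not from the original answer:

```shell
# Hypothetical submission: 4 executors, each with 2 cores
# (so up to 2 concurrent tasks per executor) and a 4 GB heap.
spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  my-app.jar
```

The same settings can also be supplied as configuration properties (`spark.executor.instances`, `spark.executor.cores`, `spark.executor.memory`) in `spark-defaults.conf` or via `SparkConf`.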