Answer Posted / Sudhir Kumar Singh
In Apache Spark, the Driver is the main program: it creates the SparkContext, defines the dataset and the transformations and actions to perform on it, and schedules the resulting tasks across the cluster's nodes. Executors are worker processes launched on those nodes; they execute the tasks the driver assigns to them, can cache data in memory or on disk, and return their results to the driver.