Answer Posted / Aditya Kumar Thakur
Apache Spark supports multiple cluster managers to coordinate the distributed execution of tasks. Some common ones are:
1. Standalone: Spark's built-in, self-contained cluster manager, which requires no external resource-management system.
2. Hadoop YARN: Apache Hadoop’s resource management system, which allows multiple applications to share a single Hadoop cluster.
3. Apache Mesos: A general-purpose cluster manager for running distributed applications across clusters of commodity hardware; Spark can run on Mesos as one of several frameworks sharing the cluster. (Note: Mesos support is deprecated in recent Spark releases.)
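The cluster manager is selected with the `--master` option of `spark-submit`. The sketch below shows the typical URL formats for each manager listed above; the host names, ports, and `my_app.py` file are placeholder assumptions, not values from the original answer:

```shell
# Standalone: point at the Spark master's host and port (7077 is the default)
spark-submit --master spark://master-host:7077 my_app.py

# Hadoop YARN: the resource manager is discovered from the Hadoop config
# (HADOOP_CONF_DIR must point at the cluster's configuration files)
spark-submit --master yarn --deploy-mode cluster my_app.py

# Mesos: point at the Mesos master (5050 is the default port)
spark-submit --master mesos://mesos-host:5050 my_app.py

# For local testing without any cluster manager, run in local mode
spark-submit --master local[*] my_app.py
```

In all cases the same application code runs unchanged; only the `--master` URL (and cluster-specific configuration such as executor memory and cores) differs between deployments.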