Is it necessary to start Hadoop to run any Apache Spark application?
Answer Posted / Sudam Kumar
It is not always necessary to start Hadoop to run a Spark application. Although Spark was originally designed to integrate with Hadoop for storage (HDFS) and resource management, it can also run without Hadoop: in local mode, on its own standalone cluster manager, on Mesos, or on Kubernetes. Note that YARN is Hadoop's resource manager, so running Spark on YARN does require a Hadoop installation; in the other deployment modes, Hadoop is only needed if the application actually reads from or writes to HDFS.
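As a minimal sketch of running Spark with no Hadoop services started, the standard `spark-submit` launcher can target local mode (the application name and script path below are hypothetical placeholders):

```shell
# Run a Spark application entirely on the local machine,
# using all available cores -- no HDFS or YARN involved.
# Input/output use the local filesystem, so Hadoop daemons
# do not need to be running.
spark-submit \
  --master "local[*]" \
  --name my-local-app \
  my_app.py
```

Switching the same application to a Hadoop cluster is just a matter of changing the deployment target, e.g. `--master yarn`, which is when a running Hadoop installation becomes a requirement.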