Answer Posted / Mohd Nadeem
Spark can be used with Hadoop in three different modes: Standalone mode, Hadoop YARN mode, and SIMR (Spark In MapReduce). In Standalone mode, Spark manages its own cluster resources and occupies a place on top of HDFS, running alongside MapReduce rather than replacing it. In YARN mode, Spark applications run on a YARN (Yet Another Resource Negotiator) cluster, which handles resource allocation, with no separate installation or administrative access required. Lastly, in SIMR mode, Spark jobs are launched from inside MapReduce itself, which lets users experiment with Spark on an existing Hadoop cluster without any installation rights.
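The deployment mode is chosen through the `--master` flag of `spark-submit`. A minimal sketch, assuming a hypothetical application jar `app.jar` and host names `spark-master` / `yarn` configured in your cluster (these are illustrative placeholders, not values from the original answer):

```shell
# Standalone mode: point directly at the Spark master's own cluster manager
spark-submit --master spark://spark-master:7077 --class com.example.App app.jar

# YARN mode: let Hadoop's YARN allocate resources
# (cluster vs client decides where the driver process runs)
spark-submit --master yarn --deploy-mode cluster --class com.example.App app.jar
```

SIMR, by contrast, is launched through its own `simr` runner script rather than `spark-submit`, since the Spark job is bootstrapped from within a MapReduce job.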