When running Spark applications, is it necessary to install Spark on every node of the YARN cluster?
Answer Posted / Chhavi Sagar
No, it is not necessary to install Spark on every node of a YARN cluster. When Spark runs on YARN, YARN itself acts as the cluster manager, so there is no separate Spark Master. Spark only needs to be installed on the machine from which the application is submitted (the client or gateway node). YARN's ResourceManager allocates containers on the NodeManager nodes, and the Spark runtime JARs are shipped to those containers automatically (or referenced from a shared location via `spark.yarn.jars` or `spark.yarn.archive`), where the driver and executors run.
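A minimal sketch of what this looks like in practice: submitting an application to YARN from the client node with `spark-submit`. The application class, JAR path, and resource sizes below are illustrative assumptions, not from the original answer.

```shell
# Submit a Spark application to YARN from the client/gateway node.
# Only this node needs the Spark installation; YARN distributes the
# Spark runtime to the containers where the driver and executors run.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

With `--deploy-mode cluster`, the driver itself runs inside a YARN container on the cluster; with `--deploy-mode client`, the driver stays on the submitting machine while executors still run in YARN containers.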