Is it necessary to install Spark on all the nodes of a YARN cluster when running Apache Spark on YARN?
Answer Posted / Swadesh Kumar Niranjan
No. When running Apache Spark on YARN (Yet Another Resource Negotiator), Spark only needs to be installed on the node from which you submit applications (the client or gateway node). The worker nodes run plain YARN NodeManagers; at submission time the Spark runtime jars are shipped to the cluster (or localized from a pre-staged HDFS location via spark.yarn.jars or spark.yarn.archive), so YARN containers can launch executors, and in cluster deploy mode the ApplicationMaster, without a local Spark installation on every node.
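A minimal sketch of how this works in practice, assuming a Hadoop/YARN client configuration on the gateway node and `SPARK_HOME` pointing at the local Spark install; the HDFS path and the examples-jar version below are illustrative, not fixed by the question:

```shell
# Stage the Spark runtime jars in HDFS once, so YARN containers can
# localize them instead of needing Spark installed on every worker node.
# (The /apps/spark/jars path is an illustrative choice.)
hdfs dfs -mkdir -p /apps/spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/*.jar /apps/spark/jars/

# Submit from the gateway node only. Executors (and, in cluster mode,
# the ApplicationMaster) run in YARN containers that receive the jars
# via YARN's distributed cache.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.jars="hdfs:///apps/spark/jars/*.jar" \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100
```

If spark.yarn.jars (or spark.yarn.archive) is not set, spark-submit zips the local jars and uploads them for each application, which still works but is slower; pre-staging them in HDFS avoids that per-job upload.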