Is it necessary to install Spark on all the nodes of a YARN cluster while running Apache Spark on YARN?
Answer / Swadesh Kumar Niranjan
No. When running Apache Spark on YARN (Yet Another Resource Negotiator), Spark only needs to be installed on the node from which you submit applications (the client or gateway node). YARN distributes Spark's runtime jars to the NodeManager hosts as part of each application's containers, so the worker nodes do not need their own Spark installation. To avoid re-uploading the jars on every submission, they can be staged once in HDFS and referenced via the spark.yarn.jars or spark.yarn.archive configuration.
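A minimal sketch of what this looks like from the gateway node, assuming a standard Spark distribution under $SPARK_HOME and HDFS available (the HDFS path and the example jar version are illustrative, not from the answer above):

```shell
# Optional: stage Spark's runtime jars in HDFS once, so YARN can localize
# them into containers instead of uploading them on every submission.
hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put "$SPARK_HOME"/jars/*.jar /spark/jars/

# Point Spark at the staged jars and submit in cluster mode. Only this
# gateway node needs a Spark installation; the NodeManager hosts do not.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.jars="hdfs:///spark/jars/*.jar" \
  --class org.apache.spark.examples.SparkPi \
  "$SPARK_HOME"/examples/jars/spark-examples_*.jar 100
```

If spark.yarn.jars (or spark.yarn.archive) is not set, Spark zips up the jars under $SPARK_HOME/jars and uploads them to the application's staging directory on every submit, which works but is slower.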
What are the disadvantages of using Spark?
What is the difference between hive and spark?
What are benefits of Spark over MapReduce?
What is hadoop spark?
How to identify that the given operation is transformation or action?
Can rdd be shared between sparkcontexts?
Can you explain how to minimize data transfers while working with Spark?
Does spark use java?
Can you define pagerank?
Can you explain spark rdd?
What is spark shuffle?
What is Directed Acyclic Graph(DAG)?