Is it necessary to install Spark on all nodes while running a Spark application on YARN?
Answer / Amit Chauhan
It is not necessary to install Spark on all nodes when running a Spark application on YARN. Spark only needs to be present on the node from which the application is submitted; YARN distributes the Spark runtime jars to containers on the worker nodes (the DataNodes in Hadoop) at launch time, and those nodes only supply the CPU and memory resources. A sketch of this setup follows below.
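As a minimal sketch of this idea: the `spark.yarn.archive` property tells YARN where to find an archive of Spark jars (typically in HDFS) to localize into each executor container, so worker nodes need no local Spark install. The application name and archive path here are placeholders, not values from the original answer, and `.master("yarn")` assumes `HADOOP_CONF_DIR` points at the cluster configuration on the submitting node.

```scala
import org.apache.spark.sql.SparkSession

object YarnNoLocalInstallSketch {
  def main(args: Array[String]): Unit = {
    // Runs in YARN client mode from the submitting (edge) node only.
    // spark.yarn.archive: YARN localizes this archive of Spark jars
    // into each executor's container, so the worker nodes themselves
    // need no Spark installation.
    val spark = SparkSession.builder()
      .appName("yarn-no-local-install-demo") // placeholder name
      .master("yarn")
      .config("spark.yarn.archive",
        "hdfs:///apps/spark/spark-libs.zip") // hypothetical HDFS path
      .getOrCreate()

    // Trivial distributed job to confirm executors launched on the cluster.
    val count = spark.range(0, 1000000).count()
    println(s"count = $count")

    spark.stop()
  }
}
```

In practice the same properties are usually passed on the command line (`spark-submit --master yarn --conf spark.yarn.archive=...`) rather than hard-coded in the application.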
In how many ways can we use Spark over Hadoop?
Explain the processing speed difference between Hadoop and Apache Spark.
What is Spark lineage?
Why is Spark used?
Name the two types of shared variables available in Apache Spark.
Can you explain how to minimize data transfers while working with Spark?
Why is Spark so fast?
What is Spark deploy mode?
Compare Hadoop and Spark.
Define SparkSession in Apache Spark. Why is it needed?
Is Spark a distributed computing framework?
What are the various libraries available on top of Apache Spark?