What is Spark architecture?
Answer / Rahul Kumar Nailwal
The Spark architecture consists of the following components: 1) the Spark Driver, which runs the application's main() method and builds the execution plan; 2) Executors, which run tasks and cache data on worker nodes; 3) the Directed Acyclic Graph (DAG) Scheduler, which turns the lineage of transformations into stages of tasks; and 4) Resilient Distributed Datasets (RDDs), the fault-tolerant, partitioned collections that Spark operates on.
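To make the driver/DAG idea concrete, here is a rough sketch in plain Python (no Spark installation assumed; `ToyRDD` is a made-up class for illustration, not Spark's API). It mimics how transformations are recorded lazily as a lineage and only executed when an action like `collect()` is called:

```python
# Illustrative sketch only: a toy "RDD" that records transformations
# lazily, the way Spark's driver builds a DAG of transformations and
# only executes them when an action is invoked.
class ToyRDD:
    def __init__(self, data, lineage=None):
        self.data = data
        self.lineage = lineage or []  # recorded transformations (the "DAG")

    def map(self, f):
        # Transformation: nothing runs yet; we just extend the lineage.
        return ToyRDD(self.data, self.lineage + [("map", f)])

    def filter(self, p):
        # Also lazy: returns a new ToyRDD with a longer lineage.
        return ToyRDD(self.data, self.lineage + [("filter", p)])

    def collect(self):
        # Action: in Spark the DAG Scheduler would now split the lineage
        # into stages and ship tasks to executors; here we simply replay
        # the recorded operations over the local data.
        result = list(self.data)
        for op, fn in self.lineage:
            if op == "map":
                result = [fn(x) for x in result]
            else:  # "filter"
                result = [x for x in result if fn(x)]
        return result

rdd = ToyRDD(range(10))
evens_squared = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
print(evens_squared.collect())  # [0, 4, 16, 36, 64]
```

In real Spark the same shape appears: transformations (`map`, `filter`) are lazy and only actions (`collect`, `count`, `saveAsTextFile`) trigger the driver to submit a job to the executors.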
Do we need to install scala for spark?
Explain the use of File system API in Apache Spark
What is the role of Driver program in Spark Application?
Is spark part of hadoop ecosystem?
What is an "Accumulator"?
Where is spark rdd?
What is shark?
Does a diesel engine have a spark plug?
What are the optimization techniques in spark?
What are the downsides of Spark?
How can you minimize data transfers when working with Spark?
What is hadoop spark?