Define the various running modes of Apache Spark?
Answer / Dhiraj Singh
1. Local mode: Spark runs on a single machine; the driver and the executors all run inside one JVM. This mode is meant for development and testing, not production.
2. Standalone mode: Spark uses its own built-in resource manager and job scheduler instead of relying on YARN or Mesos; each worker node in the Spark cluster launches executors for the application.
3. Cluster mode (YARN/Mesos): Spark runs on an existing cluster manager such as Hadoop YARN or Apache Mesos, which allocates containers on the worker nodes to run Spark executors.
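In practice, the running mode is selected by the `--master` URL passed to `spark-submit` (or set in `SparkConf`). A minimal sketch of the standard master-URL formats follows; the host names and ports are placeholder assumptions, not real cluster addresses:

```python
# Sketch: the --master URL determines Spark's running mode.
# URL formats are standard Spark master strings; hosts/ports are placeholders.
MASTER_URLS = {
    # Local mode: driver and executors share one JVM;
    # local[*] uses one worker thread per available core.
    "local": "local[*]",
    # Standalone mode: Spark's own built-in cluster manager.
    "standalone": "spark://master-host:7077",
    # Cluster mode on Hadoop YARN (cluster details come from the Hadoop config).
    "yarn": "yarn",
    # Cluster mode on Apache Mesos.
    "mesos": "mesos://mesos-host:5050",
}

def master_for(mode: str) -> str:
    """Return the --master URL for a given running mode."""
    try:
        return MASTER_URLS[mode]
    except KeyError:
        raise ValueError(f"unknown Spark running mode: {mode}")

print(master_for("local"))       # local[*]
print(master_for("standalone"))  # spark://master-host:7077
```

For example, `spark-submit --master yarn my_app.py` would submit the job to a YARN cluster, while `--master local[*]` runs everything in one JVM on the local machine.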
What is lineage graph in Apache Spark?
What is RDD Lineage?
What does RDD mean?
Explain fold() operation in Spark?
In how many ways can RDDs be created? Explain.
Why do we need SparkContext?
Name types of Cluster Managers in Spark.
Can I install Spark on Windows?
What database does Spark use?
Explain first() operation in Spark?
Define Spark SQL?
Does Apache Spark provide checkpoints?