Do I need to know Hadoop to learn Spark?
Answer / Km Karuna
No. Although Spark was influenced by Hadoop MapReduce and is often deployed alongside Hadoop, prior knowledge of Hadoop is not required to learn Spark. Spark can be used independently: it runs in its own standalone or local mode without a Hadoop cluster or HDFS.
What is a Spark context?
Explain the use of BlinkDB?
What is the abstraction of Spark Streaming?
Explain the core components of a distributed Spark application?
What is Amazon Spark?
What do you know about SchemaRDD?
Explain key features of Spark
Why is Spark used?
What is serialization in Spark?
What is the disadvantage of Spark SQL?
What is the difference between cache and persist in Spark?
What is a Parquet file?