Can I learn Spark without Hadoop?
Answer / Alok Kumar Agmani
Yes, you can learn Spark without any prior knowledge of Hadoop. Spark does not depend on Hadoop: it can run in local mode on a single machine, on its own standalone cluster manager, or on Mesos, Kubernetes, or Hadoop's YARN. YARN is just one of several supported cluster managers, and HDFS is just one of several storage options — Spark can also read from the local file system, S3, Cassandra, and other sources.
What are the roles and responsibilities of worker nodes in an Apache Spark cluster? Is a worker node in Spark the same as a slave node?
Does spark need hadoop?
What is an RDD?
Explain how Spark can be connected to Apache Mesos.
Explain pipe() operation in Apache Spark?
Can you explain about the cluster manager of apache spark?
Please explain the sparse vector in Spark.
Please provide an explanation on DStream in Spark.
How does the pipe() operation write its result to standard output in Apache Spark?
Does spark load all data in memory?
What happens to an RDD when one of the nodes it is distributed across goes down?
What is heap memory in spark?