Does Spark require HDFS?
Answer / Amlish Kumar
No, Apache Spark does not require HDFS (the Hadoop Distributed File System). Although Spark was originally designed to run alongside Hadoop and HDFS, it can now read and write data from a variety of sources, including local file systems, Amazon S3, Apache Cassandra, and more.
Related Questions:
What do you understand by Pair RDD?
Explain a scenario where you would use Spark Streaming.
List the functions of Spark SQL.
Does Spark run on Hadoop?
What is a Spark Dataset?
Does Apache Spark provide checkpointing?
List some use cases where Spark outperforms Hadoop in processing.
What is Spark certification?
What are the disadvantages of Spark SQL?
What does Apache Spark do?
What is the use of Spark SQL?
Explain the Parquet file format in Apache Spark. When is it best to choose it?