Answer Posted / Amlish Kumar
No, Apache Spark does not strictly require HDFS (Hadoop Distributed File System). While it was originally designed to work closely with the Hadoop ecosystem, Spark can read and write data from a variety of sources, including the local file system, Amazon S3, Apache Cassandra, and more.
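As a minimal sketch of the point above, the snippet below reads the same kind of data from a local path and from S3 using PySpark. The file paths and bucket name are hypothetical, and the S3 read assumes the `hadoop-aws` connector is on the classpath; this is an illustration, not a complete setup guide.

```python
from pyspark.sql import SparkSession

# A plain local Spark session -- no HDFS or Hadoop cluster required.
spark = SparkSession.builder.appName("no-hdfs-demo").getOrCreate()

# Read from the local file system (the file:// scheme makes this explicit).
# Path is hypothetical.
local_df = spark.read.csv("file:///tmp/events.csv", header=True)

# Read from Amazon S3 via the s3a:// scheme.
# Bucket name is hypothetical; requires the hadoop-aws connector.
s3_df = spark.read.parquet("s3a://my-bucket/events/")
```

The storage backend is selected purely by the URI scheme (`file://`, `s3a://`, `hdfs://`, etc.), which is why Spark itself stays agnostic about where the data lives.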