Answer Posted / Roushan Kumar Tiwary
Although Spark can work with HDFS (Hadoop Distributed File System), HDFS is not a requirement. Spark can read and write data from a variety of sources, such as the local file system, Amazon S3, Cassandra, and MongoDB.