Answer Posted / Nimmi Gupta
HDFS (the Hadoop Distributed File System) was designed as part of the Apache Hadoop project to support data-intensive applications that need scalability, fault tolerance, and simple management. It stores very large files across a cluster of commodity servers by splitting them into blocks and replicating each block on multiple nodes, and it is optimized for high-throughput streaming reads under a write-once, read-many access model rather than for low-latency random updates.
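To make the idea concrete, here is a minimal Python sketch (not real HDFS code; block size, node names, and the round-robin placement are simplified assumptions) showing how HDFS-style storage splits a file into fixed-size blocks and places replicas of each block on distinct nodes:

```python
# Illustrative sketch only: HDFS-style block splitting and replica placement.
BLOCK_SIZE = 4    # real HDFS defaults to 128 MB; tiny here for demonstration
REPLICATION = 3   # HDFS's default replication factor

def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE):
    """Split data into fixed-size blocks, as the HDFS client does on write."""
    return [data[i:i + block_size] for i in range(0, len(data), block_size)]

def place_replicas(num_blocks: int, nodes: list, replication: int = REPLICATION):
    """Assign each block to `replication` distinct nodes.

    Real HDFS placement is rack-aware; this simple round-robin scheme
    only illustrates spreading replicas for fault tolerance.
    """
    placement = {}
    for b in range(num_blocks):
        placement[b] = [nodes[(b + r) % len(nodes)] for r in range(replication)]
    return placement

data = b"hello hdfs!"
blocks = split_into_blocks(data)            # 3 blocks of up to 4 bytes
nodes = ["node1", "node2", "node3", "node4"]  # hypothetical DataNodes
plan = place_replicas(len(blocks), nodes)
```

If any single node fails, every block it held still has copies on two other nodes, which is how a cluster of unreliable commodity servers provides reliable storage.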