Define HDFS and describe its main components?
Answer posted by Mohd Shikoh
HDFS (Hadoop Distributed File System) is a distributed file system for storing very large data sets across multiple commodity servers in a Hadoop cluster. It is designed to be fault-tolerant, scalable, and efficient at high-throughput access to large volumes of data. Its main components are:

- NameNode: the master server that manages the file system namespace and the metadata mapping files to blocks and blocks to DataNodes.
- DataNodes: worker nodes that store the actual data blocks and serve read/write requests from clients.
- Secondary NameNode: a helper process that periodically merges the NameNode's edit log into the filesystem image (fsimage) to keep the metadata checkpoint compact; it is not a hot standby for the NameNode.
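To make the roles concrete, here is a minimal Java sketch using Hadoop's FileSystem client API. The NameNode address, port, and file path below are assumptions for illustration only: the client contacts the NameNode for metadata, then streams block data directly to and from DataNodes.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hypothetical NameNode address; the client asks the NameNode where
        // blocks live, then reads/writes go directly to the DataNodes.
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/tmp/hello.txt"); // hypothetical path

        // Write: the NameNode allocates blocks; DataNodes store them.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.writeUTF("Hello, HDFS!");
        }

        // Read: the NameNode returns block locations; data streams from DataNodes.
        try (FSDataInputStream in = fs.open(file)) {
            System.out.println(in.readUTF());
        }

        fs.close();
    }
}
```

In this sketch the NameNode is never in the data path, which is why a single master can coordinate a large cluster: it hands out metadata, while the heavy byte traffic flows between the client and the DataNodes.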