What should be the HDFS Block size to get maximum performance from Hadoop cluster?
Answer posted by Sachin Babu Varshney
The optimal HDFS block size for maximum performance depends on factors such as network bandwidth, storage capacity, the size of the files being stored, and the nature of the workload. The default is 128 MB in Hadoop 2.x and later (64 MB in Hadoop 1.x). Larger blocks reduce NameNode metadata overhead and launch fewer, longer-running map tasks, which suits large sequential reads; smaller blocks increase parallelism for moderately sized files. A common recommendation is to set the block size between 64 MB and 512 MB, but experimentation is usually required to find the best value for a specific use case.
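As a concrete illustration, the cluster-wide default block size can be set via the `dfs.blocksize` property in `hdfs-site.xml`. This is a minimal sketch, assuming a 256 MB block size is desired; the value shown (268435456 bytes) is one example choice, not a universal recommendation:

```xml
<!-- hdfs-site.xml: example setting dfs.blocksize to 256 MB (assumed target size) -->
<configuration>
  <property>
    <name>dfs.blocksize</name>
    <!-- 256 * 1024 * 1024 = 268435456 bytes; suffix form "256m" is also accepted -->
    <value>268435456</value>
  </property>
</configuration>
```

The block size can also be overridden per file at write time without changing the cluster default, for example: `hdfs dfs -D dfs.blocksize=268435456 -put largefile.dat /data/`. Note that changing the default only affects files written afterwards; existing files keep the block size they were written with.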