What tools are available to send streaming data to HDFS?
Answer / Rohit Kumar Bhadani
Tools such as Apache Flume, Apache Kafka (with the HDFS sink connector via Kafka Connect), Apache NiFi, and Apache Spark (Structured Streaming) can be used to send streaming data to HDFS.
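As one concrete illustration, Apache Flume ships with a built-in HDFS sink. Below is a minimal sketch of a Flume agent configuration that reads newline-separated events from a netcat source and writes them to HDFS. The agent name (`a1`), the port, and the HDFS path are placeholder assumptions for illustration, not values from the question.

```properties
# Hypothetical Flume agent "a1": netcat source -> memory channel -> HDFS sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: listen for newline-separated events on localhost:44444 (example port)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1

# Channel: buffer events in memory between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Sink: Flume's built-in HDFS sink; the path is an example, adjust to your cluster
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events/%Y-%m-%d
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.rollInterval = 300
a1.sinks.k1.hdfs.useLocalTimeStamp = true
```

Started with `flume-ng agent --conf conf --conf-file agent.conf --name a1`, the agent buffers incoming events in memory and rolls a new HDFS file every 300 seconds, partitioned into date-based directories.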
What is data integrity? How does HDFS ensure the integrity of data blocks stored in HDFS?
Explain the indexing process in HDFS.
What do you mean by the High Availability of a NameNode in Hadoop HDFS?
How does HDFS Index Data blocks? Explain.
What is the difference between the MapReduce engine and an HDFS cluster?
How to create directory in HDFS?
Why is block size large in Hadoop?
What is the procedure to create users in HDFS and how to allocate Quota to them?
Is the HDFS block size reduced to achieve faster query results?
Replication causes data redundancy and consumes a lot of space, so why is it used in HDFS?
What is a block in HDFS? What are the default block sizes in Hadoop 1 and Hadoop 2? Can we change the block size?
Why does Hive not store metadata information in HDFS?