Answer Posted / Gurudatta Vashishtha
Some common tools used in Big Data include Apache Hadoop for distributed storage and batch processing, Apache Spark for fast, general-purpose in-memory computation, Apache Hive for SQL-style data warehousing on Hadoop, Apache Pig for high-level dataflow scripting, Apache Flume for collecting and ingesting log data, and Apache Kafka for real-time data streaming.
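To make the Hadoop entry concrete, its core MapReduce model (map, shuffle, reduce) can be sketched as a toy word count in plain Python. This is purely illustrative, it does not use the actual Hadoop API or run on a cluster; the function names and sample data are invented for the sketch.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts emitted for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data tools", "big data processing"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])    # 2
print(counts["tools"])  # 1
```

In real Hadoop the map and reduce functions run in parallel across many machines, with the shuffle handled by the framework; the logic per record is the same as above.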