What do you understand by Lazy Evaluation?
Answer / Km Neelam Saini
Lazy evaluation in Apache Spark means that transformations on RDDs (such as map and filter) are not executed immediately. Instead, Spark records them as a lineage of operations and runs them only when an action (such as count, collect, or save) is triggered. This lets Spark chain multiple operations, optimize the whole execution plan before running it, and avoid computing results that are never used, which makes for efficient resource utilization.
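To make the idea concrete without needing a Spark installation, here is a minimal toy sketch in plain Python (this is an illustration of the lazy-evaluation pattern, not Spark's actual implementation): transformations only queue up work, and nothing executes until an action like collect or count is called.

```python
# Toy illustration of lazy evaluation (NOT real Spark):
# transformations are queued; only an action triggers computation.
class LazyRDD:
    def __init__(self, data, transforms=None):
        self.data = data
        self.transforms = transforms or []  # queued operations, not yet run

    def map(self, fn):
        # Transformation: record the function, do no work yet.
        return LazyRDD(self.data, self.transforms + [("map", fn)])

    def filter(self, pred):
        # Transformation: also lazy.
        return LazyRDD(self.data, self.transforms + [("filter", pred)])

    def _compute(self):
        # Replay the recorded lineage over the data.
        items = self.data
        for kind, fn in self.transforms:
            if kind == "map":
                items = [fn(x) for x in items]
            else:
                items = [x for x in items if fn(x)]
        return items

    def collect(self):
        # Action: execution happens only now.
        return self._compute()

    def count(self):
        # Action: also triggers execution.
        return len(self._compute())


rdd = LazyRDD([1, 2, 3, 4, 5])
pipeline = rdd.map(lambda x: x * 2).filter(lambda x: x > 4)
# At this point nothing has executed; the action below runs the pipeline.
result = pipeline.collect()  # [6, 8, 10]
```

In real Spark the equivalent would be `sc.parallelize([1, 2, 3, 4, 5]).map(lambda x: x * 2).filter(lambda x: x > 4).collect()`, where the map and filter likewise do nothing until collect is invoked.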