Why is Transformation lazy in Spark?
Answer by Romi Awasthi
Transformations are lazy in Spark so the engine can optimize the whole computation before running any of it. Calling a transformation such as map() or filter() does not process any data immediately; it only adds a step to a logical plan (the DAG of operations) describing how the data should be processed. Physical execution starts only when an action such as collect(), count(), or save() is triggered, at which point Spark can analyze the full plan, pipeline steps together, and avoid unnecessary work and data movement.
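To make the idea concrete outside of Spark itself, here is a minimal plain-Python sketch (the `LazyRDD` class is a hypothetical toy, not Spark's API): transformations only record steps in a plan, and nothing runs until an action is called.

```python
# Toy illustration of lazy evaluation (NOT Spark's real implementation):
# transformations append to a logical plan; the action executes it.
class LazyRDD:
    def __init__(self, data, plan=None):
        self.data = data
        self.plan = plan or []          # logical plan: list of (op, fn) steps

    def map(self, fn):
        # Transformation: no work is done, just a longer plan.
        return LazyRDD(self.data, self.plan + [("map", fn)])

    def filter(self, fn):
        # Transformation: likewise deferred.
        return LazyRDD(self.data, self.plan + [("filter", fn)])

    def collect(self):
        # Action: only now is the recorded plan executed over the data.
        out = list(self.data)
        for op, fn in self.plan:
            if op == "map":
                out = [fn(x) for x in out]
            elif op == "filter":
                out = [x for x in out if fn(x)]
        return out

rdd = LazyRDD(range(5)).map(lambda x: x * 2).filter(lambda x: x > 4)
# At this point nothing has been computed; only the plan exists.
print(len(rdd.plan))     # 2 deferred steps
print(rdd.collect())     # [6, 8]
```

In real Spark the same principle lets the scheduler see the entire chain of transformations at action time and, for example, fuse the map and filter into a single pass over each partition.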