Why are transformations lazy operations in an Apache Spark RDD? How is this useful?
Answer Posted / Rahul Bajpai
Transformations in Apache Spark are lazy: calling one (like map() or filter()) does not compute anything immediately. Instead, Spark records the transformation in the RDD's lineage and only executes the accumulated pipeline when an action (like collect(), count(), or saveAsTextFile()) is triggered. This is useful because Spark can see the whole chain of transformations before running it, which lets it optimize the execution plan, pipeline operations into a single pass over the data instead of materializing every intermediate result, skip work whose output is never used, and recompute lost partitions from the lineage for fault tolerance. Caching intermediate results that are reused further improves performance on large datasets.
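The idea can be sketched in a few lines of plain Python, without a Spark cluster. This is a toy model of the concept, not the real pyspark API: the `LazyRDD` class and its methods are hypothetical, but the behavior mirrors Spark's (transformations only record lineage; the action runs the fused pipeline in one pass).

```python
class LazyRDD:
    """Toy model of a lazily evaluated RDD (illustrative, not real pyspark)."""

    def __init__(self, data, ops=None):
        self.data = data
        self.ops = ops or []  # recorded transformations: the "lineage"

    def map(self, f):
        # Transformation: records the operation, computes nothing yet.
        return LazyRDD(self.data, self.ops + [("map", f)])

    def filter(self, pred):
        # Also lazy: just extends the lineage.
        return LazyRDD(self.data, self.ops + [("filter", pred)])

    def collect(self):
        # Action: only now does the recorded pipeline run,
        # fused into a single pass over the data.
        result = []
        for item in self.data:
            keep = True
            for kind, fn in self.ops:
                if kind == "map":
                    item = fn(item)
                elif kind == "filter" and not fn(item):
                    keep = False
                    break
            if keep:
                result.append(item)
        return result


rdd = LazyRDD(range(10))
pipeline = rdd.map(lambda x: x * x).filter(lambda x: x % 2 == 0)
# Nothing has been computed yet; collect() triggers one fused pass:
print(pipeline.collect())  # [0, 4, 16, 36, 64]
```

In real PySpark the equivalent chain (`sc.parallelize(range(10)).map(...).filter(...).collect()`) behaves the same way: the map and filter build a plan, and collect() executes it.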