Answer Posted / Yogendra Pal Singh
In Apache Spark, a "Stage" is a set of tasks that all perform the same computation, each on a different partition of the data. Spark splits a job into stages at shuffle boundaries: transformations that don't require moving data between partitions (like map or filter) are pipelined into one stage, while a wide transformation (like groupByKey or reduceByKey) ends the current stage and starts a new one. A "Task" is the smallest unit of work Spark executes: it processes a single partition on a single executor, and the tasks of a stage run in parallel across the cluster rather than on one worker at a time.
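To make the idea concrete, here is a toy model in plain Python (this is NOT Spark itself, just an illustration): a "task" processes one partition, a "stage" runs the same task over every partition, and a shuffle by key marks the boundary between two stages, just as in a word-count job.

```python
def run_stage(partitions, task):
    """Run one stage: apply the same task to every partition.
    In real Spark, these tasks run in parallel on executors."""
    return [task(part) for part in partitions]

def shuffle_by_key(partitions, num_partitions):
    """Shuffle: redistribute records so that equal keys land in the
    same output partition. Spark draws a stage boundary here."""
    out = [[] for _ in range(num_partitions)]
    for part in partitions:
        for key, value in part:
            out[hash(key) % num_partitions].append((key, value))
    return out

# Input data split into 3 partitions -> stage 1 runs 3 map tasks.
partitions = [
    [("a", 1), ("b", 1)],
    [("a", 1), ("c", 1)],
    [("b", 1), ("c", 1)],
]

# Stage 1: each map task doubles the counts in its own partition
# (a narrow transformation, so no shuffle is needed).
mapped = run_stage(partitions, lambda part: [(k, v * 2) for k, v in part])

# Shuffle boundary: regroup records by key into 2 partitions.
shuffled = shuffle_by_key(mapped, num_partitions=2)

# Stage 2: each reduce task sums the counts for the keys in its partition.
def reduce_task(part):
    totals = {}
    for k, v in part:
        totals[k] = totals.get(k, 0) + v
    return totals

reduced = run_stage(shuffled, reduce_task)
counts = {k: v for part in reduced for k, v in part.items()}
print(counts)  # totals per key: {"a": 4, "b": 4, "c": 4}
```

In real Spark you would not write any of this by hand; the DAG scheduler builds the stages from your transformations, and the Spark UI shows the resulting stage/task breakdown for each job.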