Answer Posted / Vivek Kumar Mishra
Accumulators in Apache Spark are shared variables that tasks running on executors can add to, while only the driver can read their value. They provide a way to aggregate simple results (counts, sums) across many tasks without shuffling data, and are commonly used for side-channel metrics such as counting malformed records. Updates made inside actions (e.g. `foreach`) are applied exactly once per task; updates made inside transformations (e.g. `map`) may be re-applied if a task is retried, so accumulators there should be treated as best-effort. Spark ships built-in long, double, and collection accumulators, and users can define custom ones by extending `AccumulatorV2` in Scala or Java.
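As a minimal sketch of the counting use case described above (assuming a local Spark session; the variable names are illustrative):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("AccumulatorExample")
  .master("local[*]")
  .getOrCreate()
val sc = spark.sparkContext

// Built-in long accumulator, registered with a name so it shows in the UI
val badRecords = sc.longAccumulator("badRecords")

val data = sc.parallelize(Seq("1", "2", "x", "4", "y"))

// foreach is an action, so each task's updates are applied exactly once
data.foreach { s =>
  if (!s.forall(_.isDigit)) badRecords.add(1)
}

// Only the driver reads the aggregated value
println(s"Bad records: ${badRecords.value}")

spark.stop()
```

Note that reading `badRecords.value` inside a task would fail; the aggregated value is only meaningful on the driver after the action completes.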