Answer Posted / Chhavi Singh
Accumulators in Spark are shared variables that worker tasks can only add to, while only the driver can read their value. They are useful for side-channel aggregation, such as counters or sums collected across the partitions of a job. In PySpark, you create one with `acc = sc.accumulator(0)` and update it with `acc.add(...)` inside a function passed to an action such as `foreach()`. Note that updates performed inside transformations like `map()` may be applied more than once if a task is re-executed, so Spark guarantees exactly-once accumulator updates only within actions.