Answer Posted / Himanshu Kataria
A SparkContext is the main entry point for Spark functionality. It represents the connection to a Spark cluster through the cluster manager and provides APIs for creating RDDs, broadcast variables, and accumulators, as well as for configuring the execution environment and setting the checkpoint directory. (Actions and transformations are then invoked on the RDDs themselves.) In Spark 2.x and later, a SparkContext is typically obtained via a SparkSession.