Answer Posted / Sarover Singh
SparkContext is the entry point to Spark's core API. It represents the connection to a Spark cluster and is used to create RDDs (Resilient Distributed Datasets), accumulators, and broadcast variables. Note that DataFrames and Spark Streaming are exposed through SQLContext/SparkSession and StreamingContext respectively, both of which are built on top of a SparkContext. SparkContext acts as a bridge between the local driver program and the cluster manager (such as standalone mode, YARN, or Mesos).