Answer Posted / Rao Yaduman
Apache Spark's architecture consists of a Driver program (which creates the SparkContext), a cluster manager (the Spark Master in standalone mode), Worker nodes, and Resilient Distributed Datasets (RDDs). The Driver, through the SparkContext, defines the job; the Master manages cluster resources, schedules tasks, and assigns them to Workers. Workers run executors that execute those tasks and hold RDD partitions in memory.
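As a rough illustration of this division of labour, here is a plain-Python sketch (not the real Spark API) in which a "driver" splits a dataset into RDD-like partitions and a pool of "workers" runs the same task on each partition before results are collected:

```python
from concurrent.futures import ThreadPoolExecutor

def partition(data, num_partitions):
    """Driver-side: split the dataset into roughly equal partitions."""
    size = (len(data) + num_partitions - 1) // num_partitions
    return [data[i:i + size] for i in range(0, len(data), size)]

def run_task(part):
    """Worker-side: execute the task (here, square every element) on one partition."""
    return [x * x for x in part]

def collect(data, num_partitions=4):
    """Master role: assign one task per partition to the worker pool, then gather results."""
    parts = partition(data, num_partitions)
    with ThreadPoolExecutor(max_workers=num_partitions) as workers:
        results = workers.map(run_task, parts)
    # Flatten per-partition results back into one list, like RDD.collect()
    return [x for part in results for x in part]

print(collect(list(range(10))))  # squares of 0..9, computed partition by partition
```

In real Spark the equivalent would be something like `sc.parallelize(range(10), 4).map(lambda x: x * x).collect()`, with the cluster manager, not a thread pool, deciding which Worker runs each partition's task.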