Answer Posted / Priyanka Pandey
Apache Spark's runtime architecture is composed of three main components: (1) the Driver Program, (2) the Executors, and (3) a Cluster Manager (such as Hadoop YARN, Apache Mesos, Kubernetes, or Spark's built-in Standalone manager). The Driver Program runs the application's main() function, creates the SparkContext, builds RDDs (Resilient Distributed Datasets) and their lineage, schedules tasks across the cluster, and monitors their progress. Executors are processes that run on worker nodes: they execute the tasks assigned by the driver, cache data in memory or on disk, and report task status and results back to the driver. The Cluster Manager allocates resources (CPU cores and memory) to Spark applications and launches executors on the worker nodes.
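To make the division of labor concrete, here is a toy sketch in plain Python (not actual Spark code) of how a driver partitions data, schedules tasks onto executor processes, and aggregates the partial results. The function and variable names are illustrative only; in real Spark this corresponds roughly to `rdd.map(...).reduce(...)` running across executors.

```python
# Toy model of Spark's driver/executor split (NOT real Spark code):
# the "driver" partitions the data and schedules tasks; worker
# processes play the role of executors, each computing a partial
# result that the driver then aggregates.
from concurrent.futures import ProcessPoolExecutor

def run_task(partition):
    """Executor-side work: process one partition independently."""
    return sum(x * x for x in partition)

def driver_program(data, num_partitions=4):
    """Driver-side work: split the data, schedule tasks, collect results."""
    size = max(1, len(data) // num_partitions)
    partitions = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=num_partitions) as executors:
        partials = executors.map(run_task, partitions)  # schedule tasks
    return sum(partials)  # driver aggregates the partial results

if __name__ == "__main__":
    print(driver_program(list(range(10))))  # sum of squares 0..9 = 285
```

In real Spark the cluster manager, not the driver, is responsible for launching the executor processes and granting them resources; this sketch collapses that role into the process pool for simplicity.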