Answer Posted / Shiv Shakti Shankar
A Spark job is a unit of work in Apache Spark, triggered each time an action (such as collect, count, or save) is called on a dataset. Spark builds a Directed Acyclic Graph (DAG) of the transformations leading up to that action, splits the DAG into stages at shuffle boundaries, and breaks each stage into tasks that run in parallel on executors across the cluster's nodes. A cluster manager, such as Hadoop YARN, Apache Mesos, or Kubernetes, allocates the resources on which those executors run.