Answer Posted / Rishabh Bajpai
"Spark Jobs" are units of work that you submit to a Spark cluster for processing. Each job is made up of one or more tasks, and each task is processed by a worker node in the cluster.