Answer Posted / Shobhit Rastogi
A cluster in Apache Spark is a set of machines (nodes) that work together as a single system. A cluster manager — such as Spark's built-in Standalone manager, YARN, or Apache Mesos — allocates resources across the nodes, allowing computation and data processing to run in parallel.