Answer Posted / Pooja Tripathi
No, Spark does not load the entire dataset into memory at once. Data is represented as RDDs (Resilient Distributed Datasets), which are split into partitions and evaluated lazily: transformations build up a plan, and data is only read and processed partition by partition when an action runs. Frequently reused data is kept in memory only when you explicitly persist it (e.g. with cache() or persist()).
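As a rough analogy in plain Python (not actual Spark code), the lazy, partition-wise model described above can be sketched with generators: the pipeline is just a plan until something consumes it, and data flows through one partition at a time.

```python
def partitions(n_items, part_size):
    """Yield 'partitions' lazily -- mimics how Spark splits a dataset."""
    for start in range(0, n_items, part_size):
        yield list(range(start, min(start + part_size, n_items)))

def mapped(parts, fn):
    """Lazy transformation: nothing executes until results are consumed."""
    for part in parts:
        yield [fn(x) for x in part]

# Building the pipeline processes no data yet (like chaining RDD transformations).
pipeline = mapped(partitions(10, 3), lambda x: x * 2)

# The 'action': only now are partitions materialized, one at a time.
result = [x for part in pipeline for x in part]
print(result)  # doubles of 0..9
```

In real PySpark, the equivalent would be chaining transformations on an RDD or DataFrame and calling `.cache()` before a reused action; the names above are purely illustrative.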