Answer Posted / Raj Deep
Apache Spark is a general-purpose distributed computing engine written in Scala that provides APIs for building large-scale data processing applications. PySpark, on the other hand, is the Python API for Spark: it lets developers write Spark programs in Python instead of Scala. In essence, PySpark is a wrapper over the JVM-based Spark engine (it communicates with it through Py4J), offering the convenience of Python while the heavy lifting still runs on Spark itself.