What is the default number of partitions in Spark?
Answer Posted / Ajay Singh
For Spark SQL shuffle operations (joins, aggregations), the default number of partitions is 200, controlled by the spark.sql.shuffle.partitions configuration property. For RDD operations, parallelism is governed by a different property, spark.default.parallelism, which defaults to the total number of cores across the executors (or the number of cores on the local machine in local mode), not 200.
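To see what a partition count means in practice: Spark's default HashPartitioner assigns a record to partition hash(key) mod numPartitions. The sketch below is a minimal pure-Python illustration of that rule, not Spark's actual implementation (Spark uses the JVM hashCode and a non-negative modulo in Scala).

```python
def assign_partition(key, num_partitions=200):
    # Illustrative stand-in for Spark's HashPartitioner:
    # partition index = hash(key) mod num_partitions.
    # Python's % already yields a non-negative result for a
    # positive modulus, matching Spark's nonNegativeMod behavior.
    return hash(key) % num_partitions

# Every key lands in one of the num_partitions buckets.
keys = ["user_1", "user_2", "user_3"]
partitions = [assign_partition(k) for k in keys]
assert all(0 <= p < 200 for p in partitions)
```

Note that Python string hashes are salted per process, so the exact bucket for a given key varies between runs; only the range invariant is stable.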
What is the latest version of spark?
Explain how RDDs work with Scala in Spark
What is meant by Transformation? Give some examples.
List the advantages of the Parquet file format in Apache Spark.