What is the difference between Scala and Spark?
Answer / Dwijendra Kumar Upadhyay
Scala is a general-purpose programming language created by Martin Odersky, while Apache Spark is an open-source distributed big-data processing framework. Spark itself is written largely in Scala, and Scala is a common choice for writing Spark applications, but it is not required: Spark also provides APIs for Java, Python, R, and SQL.
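To illustrate the relationship, here is a minimal sketch of a Spark word-count application written in Scala. This is a hypothetical example, not from the original answer: it assumes a local Spark installation and an illustrative input file path (`input.txt`).

```scala
// Hypothetical sketch: Scala (the language) driving Spark (the framework).
// Assumes Spark is on the classpath; "input.txt" is an illustrative path.
import org.apache.spark.sql.SparkSession

object WordCount {
  def main(args: Array[String]): Unit = {
    // Entry point to Spark; run locally for this sketch
    val spark = SparkSession.builder()
      .appName("WordCount")
      .master("local[*]")
      .getOrCreate()

    val lines = spark.sparkContext.textFile("input.txt")

    // Scala lambdas passed to Spark's RDD transformations
    val counts = lines
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    spark.stop()
  }
}
```

The point of the sketch: everything Spark-specific comes from the `org.apache.spark` API, while the lambdas, pattern of method chaining, and `object`/`main` structure are plain Scala, which is why the two are often mentioned together but are not the same thing.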
Who created Spark?
Can you mention some features of Spark?
What are the types of Apache Spark transformations?
Can I learn Spark without Hadoop?
What is driver memory and executor memory in Spark?
What is a Spark driver application?
Can you explain Spark MLlib?
What do you understand by the Parquet file format?
How many partitions are created by default in an Apache Spark RDD?
List the various types of "Cluster Managers" in Spark.
Explain the createOrReplaceTempView() API.
Explain the level of parallelism in Spark Streaming. Also, describe its need.