In what ways is SparkSession different from SparkContext?
Answer / Akash Tripathi
SparkSession and SparkContext are both entry points to Spark APIs, but they differ in scope and recommended use: (1) SparkSession, introduced in Apache Spark 2.0, is the recommended entry point for new applications; it wraps the older contexts (SparkContext, SQLContext, HiveContext) and provides a single, unified interface for configuring the session and accessing the DataFrame, Dataset, and SQL APIs; (2) SparkContext is the older, lower-level entry point, used mainly for the RDD API; it still offers fine-grained control over configuration, and in Spark 2.x+ it remains available through the session rather than being created separately.
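The difference can be seen in code. Below is a minimal sketch (assuming Spark 2.x or later is on the classpath; the app name and config values are illustrative):

```scala
import org.apache.spark.sql.SparkSession

// SparkSession: the unified entry point since Spark 2.0.
// The builder pattern configures the whole session in one place.
val spark = SparkSession.builder()
  .appName("entry-point-example")       // hypothetical app name
  .master("local[*]")
  .config("spark.sql.shuffle.partitions", "8")
  .getOrCreate()

// The older SparkContext is not created separately; it is
// accessible through the session for RDD-level work.
val sc = spark.sparkContext
val rdd = sc.parallelize(Seq(1, 2, 3))

// DataFrame/SQL APIs are reached directly from the session.
val df = spark.range(10)

spark.stop()
```

Note that `getOrCreate()` returns an existing session if one is already running, which is part of how SparkSession manages the session lifecycle for you.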