Why do we need SparkContext?
Answer / Sarover Singh
SparkContext is the entry point to Spark's core APIs. From the driver program it is used to create RDDs (Resilient Distributed Datasets), accumulators, and broadcast variables, and it acts as the bridge between the driver and the cluster manager (standalone, YARN, or Mesos). Note that since Spark 2.0, DataFrames and Spark SQL are accessed through SparkSession, which wraps a SparkContext internally.
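A minimal sketch of what this looks like in practice (the application name "MyApp" and the local[*] master are illustrative choices, not required values):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Configure and create the SparkContext: the driver's handle to the cluster.
val conf = new SparkConf().setAppName("MyApp").setMaster("local[*]")
val sc = new SparkContext(conf)

// Use the context to create an RDD from a local collection and run a job.
val rdd = sc.parallelize(Seq(1, 2, 3, 4))
println(rdd.map(_ * 2).reduce(_ + _))  // sums the doubled values

sc.stop()  // release cluster resources when done
```

Here setMaster("local[*]") runs Spark inside the driver JVM using all local cores; on a real cluster the master URL would point at the cluster manager instead.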
What is difference between map and flatmap in spark?
How does Spark handle monitoring and logging in Standalone mode?
What are the features of Spark RDDs?
Can you do real-time processing with Spark SQL?
What is meant by Transformation? Give some examples.
What are the various levels of persistence in Apache Spark?
How is a DAG created in Spark?
Can you define PageRank?
What is the difference between Scala and Spark?
Explain different transformations in DStream in Apache Spark Streaming?
How does a Spark RDD work?