Can RDDs be shared between SparkContexts?
Answer / Seema Saher
No. An RDD is bound to the SparkContext that created it: its lineage, partition metadata, and scheduling all live in that context's driver, so a different SparkContext cannot operate on it. (By default, only one SparkContext can be active per JVM anyway.) To "share" the data itself, write it to external storage, such as HDFS, and read it back from the other context.
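A minimal Scala sketch of this behavior, assuming a local-mode Spark installation (the app names and file path are illustrative only):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object RddContextDemo {
  def main(args: Array[String]): Unit = {
    // The RDD's lineage is registered with the context that creates it.
    val sc1 = new SparkContext(new SparkConf().setAppName("ctx1").setMaster("local[2]"))
    val rdd = sc1.parallelize(1 to 10)
    println(rdd.sum()) // fine: same context

    // To make the data available to another context, persist it externally
    // first, e.g. rdd.saveAsTextFile("/tmp/shared"), then stop the context.
    sc1.stop()

    // A second context cannot reuse the old RDD object: any action on `rdd`
    // here would fail, because it still references the stopped sc1.
    val sc2 = new SparkContext(new SparkConf().setAppName("ctx2").setMaster("local[2]"))
    // val rdd2 = sc2.textFile("/tmp/shared") // re-read instead of sharing
    sc2.stop()
  }
}
```

The same constraint applies in PySpark: the RDD object holds a reference to its originating context, so cross-context sharing goes through storage, not memory.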
Is java required for spark?
Describe coalesce() operation. When can you coalesce to a larger number of partitions? Explain.
What is a DataFrame in Spark?
What is difference between dataset and dataframe in spark?
Explain Spark Streaming with Socket?
How is rdd distributed?
What is executor memory in a spark application?
How spark works on hadoop?
What is difference between spark and scala?
Can spark be used without hadoop?
Does spark replace hadoop?