What are the two ways to create an RDD in Spark?
Answer / Raman Kumar Raushan
The two main ways to create RDDs in Spark are by parallelizing an existing collection in the driver program (using the parallelize method) and by referencing a dataset in external storage, such as a text file (using the textFile method).
How do you process data using transformation operations in Spark?
What is the difference between Python and Spark?
Should I install Spark on all nodes of a YARN cluster?
What does RDD stand for?
What database does Spark use?
Does Apache Spark provide checkpoints?
How do you create a sparse vector from a dense vector?
Which is the best Spark certification?
What are the components of the Apache Spark ecosystem?
How is Spark faster than Hadoop?
Why was Spark created?
What are the exact differences between the reduce and fold operations in Spark?