What is sc.parallelize?
Answer / Ajay Kant Singh
In Apache Spark, the SparkContext (sc) provides the parallelize() method, which takes a local collection on the driver and distributes its elements across the cluster as an RDD, so they can be processed in parallel.
How is an RDD fault-tolerant?
How do I start a Spark server?
Explain transformation and action in RDD in Apache Spark.
How can you identify whether a given operation is a transformation or an action?
What happens when we submit a Spark job?
Can you do real-time processing with Spark SQL?
What is Apache Spark used for?
Do I need to install Hadoop for Spark?
Define the Parquet file format. How do you convert data to Parquet format?
How does Spark work?
What is a Spark database?
What is RDD map?