How does lazy evaluation work in Spark?
Answer / Kiran Chhabra
In Apache Spark, transformations are evaluated lazily: the operations you perform on an RDD or DataFrame (e.g., map, filter) are not executed immediately. Instead, Spark records them as a lineage (a DAG of transformations) and only runs the computation when an action is called (e.g., count, saveAsTextFile). This lets Spark optimize the entire pipeline as a whole and avoid materializing intermediate results.
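The record-then-execute pattern described above can be sketched in plain Python with a toy `LazyRDD` class (a hypothetical name, not Spark's actual implementation, which builds a DAG of RDD lineage across a cluster):

```python
# Minimal sketch of Spark-style lazy evaluation in plain Python.
# LazyRDD is an illustrative toy class, not part of any Spark API.

class LazyRDD:
    """Records transformations without running them; executes on an action."""

    def __init__(self, data, pipeline=None):
        self._data = data
        self._pipeline = pipeline or []   # deferred transformations (the "lineage")

    # --- transformations: return a new LazyRDD, nothing runs yet ---
    def map(self, fn):
        return LazyRDD(self._data, self._pipeline + [("map", fn)])

    def filter(self, pred):
        return LazyRDD(self._data, self._pipeline + [("filter", pred)])

    # --- actions: replay the recorded pipeline and materialize results ---
    def collect(self):
        items = iter(self._data)
        for kind, fn in self._pipeline:
            items = map(fn, items) if kind == "map" else filter(fn, items)
        return list(items)

    def count(self):
        return len(self.collect())


rdd = LazyRDD(range(10))
evens_squared = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
# Nothing has executed yet; only the lineage has been recorded.
print(evens_squared.count())    # action triggers execution -> 5
print(evens_squared.collect())  # -> [0, 4, 16, 36, 64]
```

Because nothing runs until an action, chaining `filter` before `map` costs nothing extra, and an engine like Spark is free to reorder or fuse the recorded steps before executing them.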