What is the use of checkpoints in spark?
Answer / Alok Kumar Srivastav
Checkpoints in Apache Spark periodically save the state of a computation to reliable storage (such as HDFS), so that if a failure occurs the application can recover from the last saved state instead of recomputing everything from the beginning. Checkpointing also truncates the RDD lineage graph: once an RDD is checkpointed, Spark no longer needs its full chain of parent transformations. This improves fault tolerance and speeds up recovery, which matters especially for long lineage chains and for Spark Streaming applications.