How does the DAG function in Spark?
Answer / Pankaj Kumar Singh
A Directed Acyclic Graph (DAG) in Spark represents the logical execution plan of a job: a set of vertices connected by directed edges, with no cycles. The vertices represent the RDDs (or DataFrames) and the edges represent the transformations applied to them. Because transformations are lazy, calling them only extends the DAG; nothing runs until an action is invoked. At that point the DAG scheduler splits the graph into stages of tasks, using the dependencies between RDDs to decide stage boundaries, and submits the stages for execution.
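The lazy-DAG idea above can be illustrated with a minimal, self-contained sketch. This is not Spark's internal implementation; the toy `RDD` class and its methods here are hypothetical stand-ins that only record the lineage graph, and the `collect()` action replays it:

```python
# Minimal sketch (NOT Spark's internals): transformations build a DAG lazily;
# only an action walks the lineage back to the source and computes results.

class RDD:
    """Toy RDD vertex: records its parent and transformation instead of computing."""
    def __init__(self, data=None, parent=None, op=None):
        self._data = data      # only the source vertex holds data
        self._parent = parent  # edge back to the parent vertex
        self._op = op          # transformation to apply lazily

    def map(self, f):
        # Transformation: returns a new vertex; computes nothing yet.
        return RDD(parent=self, op=lambda rows: [f(r) for r in rows])

    def filter(self, pred):
        # Transformation: also just extends the DAG.
        return RDD(parent=self, op=lambda rows: [r for r in rows if pred(r)])

    def collect(self):
        # Action: walk the DAG back to the source, then replay the edges in order.
        chain, node = [], self
        while node._parent is not None:
            chain.append(node._op)
            node = node._parent
        rows = node._data
        for op in reversed(chain):
            rows = op(rows)
        return rows

rdd = RDD(data=[1, 2, 3, 4, 5])
result = rdd.map(lambda x: x * 2).filter(lambda x: x > 4).collect()
print(result)  # [6, 8, 10]
```

The design point this mirrors in real Spark: since the whole lineage is known before execution, the scheduler can pipeline narrow transformations into a single stage and recompute lost partitions from the graph on failure.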
Name a few Transformations and Actions?
What is GraphX?
What are actions and transformations?
What is PageRank Algorithm?
How is PySpark different from Python?
What is PySpark in Python?
What is Sliding Window?
What is the advantage of Spark's lazy evaluation?
What is SparkContext in PySpark?
What is parallelize in PySpark?
Explain the Apache Spark Architecture. How to Run Spark applications?
What is the role of cache() and persist()?