Do you have to install Spark on all nodes of a YARN cluster?
Answer / Pranav Prakash
No, it is not necessary. Spark runs on top of YARN, so you only need Spark on the node you submit from. A single ApplicationMaster (AM) is launched in one container, and the AM requests executor containers from YARN and schedules tasks on the worker nodes across the cluster.
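A minimal sketch of what this looks like in practice: submitting from a single gateway node that has Spark installed. The application name `my_app.py` and executor count are placeholders; YARN ships the Spark runtime to the worker containers, so the workers themselves need no Spark installation.

```shell
# Hypothetical submission from one gateway node. --master yarn tells
# spark-submit to ask YARN for containers; --deploy-mode cluster runs
# the ApplicationMaster (and driver) inside a YARN container.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  my_app.py
```

YARN then starts one AM container, and the AM negotiates the four executor containers on whichever worker nodes have capacity.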
What is pyspark used for?
How do I open the pyspark shell in Windows?
What are Accumulators?
Does pyspark work with python3?
Is pyspark a framework?
What is the relationship between a Job, a Stage, and a Task?
What is rdd in pyspark?
What is PageRank Algorithm?
What is GraphX?
What is the role of cache() and persist()?
What is the advantage of Spark's lazy evaluation?
Mention a few Transformations and Actions.