

When running Spark applications, is it necessary to install Spark on every node of the YARN cluster?




Answer / Chhavi Sagar

No, it is not necessary to install Spark on every node of a YARN cluster. Spark runs on top of YARN, so Spark only needs to be installed on the client (gateway) node from which the application is submitted. When a job is submitted, YARN's ResourceManager allocates containers on the worker nodes, the Spark runtime jars are shipped to those containers, and the executors (and, in cluster deploy mode, the driver) run inside them.
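For illustration, a minimal sketch of submitting an application to YARN from a single client machine; the script name and resource settings are hypothetical, not from the question above:

```shell
# Submitted from one client node that has Spark installed.
# YARN distributes the Spark runtime jars to the worker nodes,
# so the workers need no local Spark installation.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  my_app.py   # hypothetical PySpark application script
```

In cluster deploy mode the driver itself runs inside a YARN container; in client mode it stays on the submitting machine.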



More PySpark Interview Questions

How is Spark SQL different from HQL and SQL?

1 Answer


What is YARN?

1 Answer


Does pyspark install spark?

1 Answer


What is the function of cache() and persist()?

1 Answer


What is a Sliding Window?

1 Answer


What is the difference between RDD, DataFrame and Dataset?

1 Answer


What is the advantage of Spark's lazy evaluation?

1 Answer


How is streaming implemented in Spark? Explain with examples.

1 Answer


What is the PageRank algorithm?

1 Answer


How does the DAG work in Spark?

1 Answer


Explain the Apache Spark Architecture. How to Run Spark applications?

1 Answer


How can you connect Hive to Spark SQL?

0 Answers  

