

How would you determine the number of partitions while creating an RDD? How can the partition count be set?




Answer / Ms Varsha

In PySpark, the number of partitions of a Resilient Distributed Dataset (RDD) can be checked with getNumPartitions(). The partition count can be set when the RDD is created, for example when reading data from a file or parallelizing a collection. Here is an example of creating an RDD with a requested number of partitions:

rdd = sc.textFile("data.txt", 4)

In this example, the textFile function takes two arguments: the path to the data and minPartitions, the minimum number of partitions to split the file into.



More PySpark Interview Questions

What are the different levels of persistence in Apache Spark?

1 Answers  


What is parallelize in pyspark?

1 Answers  


What is sparkcontext in pyspark?

1 Answers  


What is YARN?

1 Answers  


What is Pyspark?

1 Answers  


Name the components of the Spark ecosystem?

1 Answers  


What is map in pyspark?

1 Answers  


What are the optimizations a developer can make while working with Spark?

1 Answers  


Explain the key features of Apache Spark?

1 Answers  


What is DStream?

1 Answers  


What is PageRank Algorithm?

1 Answers  


Explain the components of Spark architecture?

1 Answers  

