How do I start a Spark master?
Answer / Poonam Shreshtha
To start a Spark master, use the `start-master.sh` script shipped in the `sbin` directory of the Spark distribution (there is no standalone `spark-master` command):

`$SPARK_HOME/sbin/start-master.sh --host <your-host> --port <your-port>`

Once started, the master logs its URL (by default `spark://<your-host>:7077`), which workers and applications use to connect to it. It also serves a web UI, by default on port 8080.
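As a sketch of a full standalone-cluster startup, assuming a Spark distribution is unpacked at `/opt/spark` (a hypothetical path) and the scripts are run on the master host:

```shell
# Hypothetical install location; adjust to where Spark is unpacked.
export SPARK_HOME=/opt/spark

# Start the master daemon; it logs its URL, spark://<host>:7077 by default.
"$SPARK_HOME/sbin/start-master.sh" --host spark-master.example.com --port 7077

# Start a worker on this (or any other) machine and register it with the master.
# Note: this script is named start-slave.sh in Spark 2.x and earlier.
"$SPARK_HOME/sbin/start-worker.sh" spark://spark-master.example.com:7077
```

Both daemons run in the background and write their logs under `$SPARK_HOME/logs`; use the matching `stop-master.sh` and `stop-worker.sh` scripts in `sbin` to shut them down.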
Explain the action count() in Spark RDD?
What is application master in spark?
How apache spark works?
Why do we use persist() on the links RDD?
Is spark written in scala?
How do you stop a spark?
Define functions of SparkCore?
How can I improve my spark performance?
What do you understand about yarn?
How do you integrate spark and hive?
What is an accumulator in spark?
Why is Spark RDD immutable?