
Starting an Apache Spark cluster


Starting the Server:

To start a standalone master server, execute:

$ ./sbin/start-master.sh

To start one or more workers and connect them to the master, run:

$ ./sbin/start-slave.sh <master-spark-URL>
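The two steps above can be sketched end to end. This is a hedged sketch, not something runnable outside a Spark installation: the host name `master-host`, the default port 7077, and the log file path are assumptions for illustration and depend on your setup.

```shell
# On the master machine: start the standalone master.
# By default it listens on spark://<hostname>:7077 and serves a web UI on port 8080.
./sbin/start-master.sh

# The master URL is printed in the master log under $SPARK_HOME/logs/
# (exact file name varies by user and host; this pattern is an assumption):
grep "Starting Spark master" logs/spark-*-org.apache.spark.deploy.master.Master-*.out

# On each worker machine: connect to that master URL
# (spark://master-host:7077 is a placeholder for your own master's URL).
./sbin/start-slave.sh spark://master-host:7077
```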

Take a look at this article: Apache Spark Cluster Installation and Configuration Guide


It looks like the /opt/spark folder does not have permission to write its log files. Grant full permissions:

$ sudo chmod -R 777 /opt/spark

Also check that the /opt/spark folder has the same user:group on every machine (master and slaves both); otherwise run:

$ sudo chown -R userName:groupName /opt/spark
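Before running a recursive chmod/chown on /opt/spark itself, the fix can be rehearsed safely on a stand-in directory. The temporary directory below is an assumption used so the commands are harmless to run; the `stat` flags are GNU coreutils syntax (Linux).

```shell
# Create a stand-in for /opt/spark (a temp dir, so this is safe to run).
dir=$(mktemp -d)
mkdir -p "$dir/logs"

# Grant full read/write/execute permissions recursively, as suggested above.
chmod -R 777 "$dir"

# Verify the mode of the logs directory in octal.
stat -c '%a' "$dir/logs"   # prints 777

# Check ownership (user:group) — this must match across master and slaves.
stat -c '%U:%G' "$dir"

# Clean up the stand-in directory.
rm -rf "$dir"
```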