Failed to submit local jar to spark cluster: java.nio.file.NoSuchFileException



> UPDATE
>
> I think I understand a little more about Spark now, and why I had this problem (and the related "spark-submit error: ClassNotFoundException"). The key point is that although the word REST is used here (REST URL: spark://127.0.1.1:6066 (cluster mode)), the application jar will NOT be uploaded to the cluster after submission, which is different from what I had assumed. So the Spark cluster cannot find the application jar and cannot load the main class.

That's why you have to place the jar file on the master node OR put it into HDFS before running spark-submit.
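As a rough illustration of that rule (the helper function and all names below are mine, not part of Spark): in cluster deploy mode a bare local path is only valid on the node where the file actually exists, while an hdfs:// (or other shared-storage) URL can be fetched by any node.

```shell
#!/bin/sh
# Hypothetical helper, NOT a Spark API: classify a jar URL by whether
# every node in a cluster could fetch it in cluster deploy mode.
is_cluster_visible() {
  case "$1" in
    # shared storage schemes: any node can pull the jar
    hdfs://*|s3a://*|http://*|https://*) echo yes ;;
    # bare local path: exists only on the machine you copied it to
    *) echo no ;;
  esac
}

is_cluster_visible "hdfs://namenode:8020/user/tom/app.jar"   # yes
is_cluster_visible "/home/tom/app.jar"                       # no
```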

This is how to do it:

1.) Transfer the file to the master node with the scp command (on Ubuntu or any Linux):

$ scp <file> <username>@<IP address or hostname>:<Destination>

For example:

$ scp mytext.txt tom@128.140.133.124:~/

2.) Or transfer the file to HDFS:

$ hdfs dfs -put mytext.txt
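To make step 2.) concrete, here is a sketch of how the hdfs dfs -put destination and the jar URL you later hand to spark-submit fit together. All hostnames and paths are placeholders I chose, and the HDFS RPC port 8020 is an assumption (it is a common default, but clusters vary):

```shell
#!/bin/sh
# Placeholders: adjust to your own cluster.
NAMENODE="namenode:8020"     # assumption: default HDFS RPC port
HDFS_DIR="/user/tom/jars"
JAR="myapp.jar"

# Upload steps (need a running HDFS, shown as comments for context):
#   hdfs dfs -mkdir -p "$HDFS_DIR"
#   hdfs dfs -put -f "$JAR" "$HDFS_DIR/"
#   hdfs dfs -ls "$HDFS_DIR/"        # verify the upload

# This is the jar argument you would then pass to spark-submit:
echo "hdfs://$NAMENODE$HDFS_DIR/$JAR"
```

The point is that the path after `-put` and the `hdfs://` URL given to spark-submit must refer to the same file, with the namenode address prepended.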

Hope this helps.


A standalone-mode cluster wants the jar files on HDFS because in cluster deploy mode the driver can be launched on any node in the cluster.

hdfs dfs -put xxx.jar /user/

spark-submit --master spark://xxx:7077 \
  --deploy-mode cluster \
  --supervise \
  --driver-memory 512m \
  --total-executor-cores 1 \
  --executor-memory 512m \
  --executor-cores 1 \
  --class com.xiyou.bi.streaming.game.common.DmMoGameviewOnlineLogic \
  hdfs://xxx:8020/user/hutao/xxx.jar