Status: In Progress
Affects Version/s: 1.6.0
Fix Version/s: None
I have a 3-node cluster: namenode, second, and data1.
I submit a job from namenode with this shell command:
bin/spark-submit --deploy-mode cluster --class com.bjdv.spark.job.Abc --total-executor-cores 5 --master spark://namenode:6066 hdfs://namenode:9000/sparkjars/spark.jar
The driver may be started on another node, such as data1.
The problem is: when I set SPARK_LOCAL_IP in conf/spark-env.sh on namenode, the driver is started with that value. But when the driver actually starts on data1, it tries to bind the address 'namenode' on data1, which is not a local address there, so the driver throws an exception like this:
Service 'Driver' failed after 16 retries!
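For illustration, a minimal sketch of the misconfiguration described above, plus one common workaround (setting SPARK_LOCAL_IP per node to that node's own hostname so the driver binds a local address wherever it lands). The exact value 'namenode' is taken from the cluster above; whether this workaround fits depends on each node's hostname resolving correctly:

```shell
# conf/spark-env.sh -- problematic: a fixed value copied to every node.
# A driver scheduled on data1 tries to bind 'namenode', which is not
# local on data1, and fails with:
#   Service 'Driver' failed after 16 retries!
export SPARK_LOCAL_IP=namenode

# Possible workaround (assumption: each node's hostname resolves to its
# own reachable address): let every node bind its own name instead.
export SPARK_LOCAL_IP=$(hostname)
```

With the per-node value, the same spark-env.sh file can be distributed unchanged to namenode, second, and data1.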