Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Fix Version: 2.3.0
- Labels: None
Description
In the current Spark on YARN code, the AM always copies its own environment variables to executors, overwriting theirs, so we cannot set different values for executors.
To reproduce the issue, start spark-shell like this:
./bin/spark-shell --master yarn-client --conf spark.executorEnv.SPARK_ABC=executor_val --conf spark.yarn.appMasterEnv.SPARK_ABC=am_val
Then check executor env variables by
sc.parallelize(1 to 1).flatMap { i => sys.env.toSeq }.collect.foreach(println)
You will always get am_val instead of executor_val. So the AM should not overwrite executor env variables that were explicitly set via spark.executorEnv.
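The intended precedence can be sketched as a simple map merge (a hypothetical illustration, not the actual Spark internals — `mergeEnv` and its parameter names are made up here): when building an executor's environment, entries from spark.executorEnv should win over the AM-propagated ones.

```scala
// Hypothetical sketch of the desired precedence, not Spark's real code:
// executor-specific env entries must override AM-propagated entries.
object EnvPrecedence {
  def mergeEnv(amEnv: Map[String, String],
               executorEnv: Map[String, String]): Map[String, String] =
    // In Scala's Map ++, entries from the right-hand map win on key
    // collisions, so executor-specific values take precedence.
    amEnv ++ executorEnv

  def main(args: Array[String]): Unit = {
    val merged = mergeEnv(
      Map("SPARK_ABC" -> "am_val"),        // propagated from the AM
      Map("SPARK_ABC" -> "executor_val"))  // set via spark.executorEnv
    println(merged("SPARK_ABC"))
  }
}
```

With the buggy behavior the AM's value clobbers the executor's; with the merge order above, `executor_val` is kept, which is what the reproduction steps expect.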