Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Affects Version/s: 2.3.1
- Labels: None
Description
I tried to add another test to the current suite that uses more than one argument, and it fails:
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /sbin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=9.0.10.29 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.DFSReadWriteTest spark-internal '/etc/resolv.conf hdfs:///test-SGzsB'
2018-06-29 15:31:51 WARN Utils:66 - Kubernetes master URL uses HTTP instead of HTTPS.
2018-06-29 15:31:52 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
args size: 1
Arg: /etc/resolv.conf hdfs:///test-SGzsB
DFS Read-Write Test
Usage: localFile dfsDir
localFile - (string) local file to use in test
The reason is this line: https://github.com/apache/spark/blob/master/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/KubernetesTestComponents.scala#L109 — it joins all test arguments into a single element of the final command array. ProcessBuilder will not split that element back into separate arguments later on: https://github.com/f6e6899a8b8af99cd06e84cae7c69e0fc35bc60a/resource-managers/kubernetes/integration-tests/src/test/scala/org/apache/spark/deploy/k8s/integrationtest/ProcessUtils.scala#L32
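A minimal sketch of the underlying behavior (not the actual Spark test harness): `java.lang.ProcessBuilder`, which the Scala `ProcessUtils` wraps, passes each list element to the child process as exactly one argv entry and never re-splits on whitespace. Using `sh -c 'echo $#'` as a stand-in child process, the demo below shows that `"/etc/resolv.conf hdfs:///test"` as one element arrives as one argument, matching the `args size: 1` seen in the log. The class and helper names here are illustrative.

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.util.Arrays;
import java.util.List;

public class ArgSplitDemo {
    // Run a command via ProcessBuilder and return its stdout, trimmed.
    static String run(List<String> cmd) throws Exception {
        Process p = new ProcessBuilder(cmd).start();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (InputStream in = p.getInputStream()) {
            in.transferTo(out);
        }
        p.waitFor();
        return out.toString().trim();
    }

    public static void main(String[] args) throws Exception {
        // Both values joined into ONE list element: the child process
        // sees a single argument containing a space ($# == 1).
        String joined = run(Arrays.asList(
                "sh", "-c", "echo $#", "sh",
                "/etc/resolv.conf hdfs:///test"));

        // Each value as its OWN list element: the child process sees
        // two separate arguments ($# == 2), which is what
        // DFSReadWriteTest expects.
        String split = run(Arrays.asList(
                "sh", "-c", "echo $#", "sh",
                "/etc/resolv.conf", "hdfs:///test"));

        System.out.println("joined arg count: " + joined); // 1
        System.out.println("split arg count: " + split);   // 2
    }
}
```

So the fix has to happen on the building side: the test components must add each application argument as its own element of the command list rather than joining them into one string.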