Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 4.0.0
Description
[info] - PVs with local hostpath storage on statefulsets *** FAILED *** (3 minutes, 11 seconds)
[info] The code passed to eventually never returned normally. Attempted 7921 times over 3.0001059888166663 minutes. Last failure message: "++ id -u
[info] + myuid=185
[info] ++ id -g
[info] + mygid=0
[info] + set +e
[info] ++ getent passwd 185
[info] + uidentry=
[info] + set -e
[info] + '[' -z '' ']'
[info] + '[' -w /etc/passwd ']'
[info] + echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
[info] + '[' -z /opt/java/openjdk ']'
[info] + SPARK_CLASSPATH=':/opt/spark/jars/*'
[info] + grep SPARK_JAVA_OPT_
[info] + sort -t_ -k4 -n
[info] + sed 's/[^=]*=\(.*\)/\1/g'
[info] + env
[info] ++ command -v readarray
[info] + '[' readarray ']'
[info] + readarray -t SPARK_EXECUTOR_JAVA_OPTS
[info] + '[' -n '' ']'
[info] + '[' -z ']'
[info] + '[' -z ']'
[info] + '[' -n '' ']'
[info] + '[' -z ']'
[info] + '[' -z x ']'
[info] + SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
[info] + SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*:/opt/spark/work-dir'
[info] + case "$1" in
[info] + shift 1
[info] + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --conf "spark.executorEnv.SPARK_DRIVER_POD_IP=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
[info] + exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.244.0.45 --conf spark.executorEnv.SPARK_DRIVER_POD_IP=10.244.0.45 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.examples.MiniReadWriteTest local:///opt/spark/examples/jars/spark-examples_2.12-4.0.0-SNAPSHOT.jar /opt/spark/pv-tests/tmp3727659354473892032.txt
[info] Files local:///opt/spark/examples/jars/spark-examples_2.12-4.0.0-SNAPSHOT.jar from /opt/spark/examples/jars/spark-examples_2.12-4.0.0-SNAPSHOT.jar to /opt/spark/work-dir/spark-examples_2.12-4.0.0-SNAPSHOT.jar
[info] 23/07/20 06:15:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[info] Performing local word count from /opt/spark/pv-tests/tmp3727659354473892032.txt
[info] File contents are List(test PVs)
[info] Creating SparkSession
[info] 23/07/20 06:15:15 INFO SparkContext: Running Spark version 4.0.0-SNAPSHOT
[info] 23/07/20 06:15:15 INFO SparkContext: OS info Linux, 5.15.0-1041-azure, amd64
[info] 23/07/20 06:15:15 INFO SparkContext: Java version 17.0.7
[info] 23/07/20 06:15:15 INFO ResourceUtils: ==============================================================
[info] 23/07/20 06:15:15 INFO ResourceUtils: No custom resources configured for spark.driver.
[info] 23/07/20 06:15:15 INFO ResourceUtils: ==============================================================
[info] 23/07/20 06:15:15 INFO SparkContext: Submitted application: Mini Read Write Test
[info] 23/07/20 06:15:16 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
This test has been failing for the past two days.
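For reference, the "never returned normally" failure comes from ScalaTest's Eventually.eventually, which the K8s integration suite uses to poll the driver pod for the expected output until a timeout; here the block kept failing for the full 3-minute window (7921 attempts), so the driver log above was reported as the last failure message. Below is a minimal sketch of that polling pattern, with a hypothetical driverOutput() helper, log path, and retry interval standing in for the real test code:

{code:scala}
import org.scalatest.concurrent.Eventually.eventually
import org.scalatest.concurrent.PatienceConfiguration.{Interval, Timeout}
import org.scalatest.matchers.should.Matchers._
import org.scalatest.time.{Milliseconds, Minutes, Span}

object EventuallySketch extends App {
  // Hypothetical stand-in for the real check; the actual suite reads the
  // driver pod output through the Kubernetes client.
  def driverOutput(): String = scala.io.Source.fromFile("/tmp/driver-pod.log").mkString

  // The block is retried roughly every Interval until it stops throwing.
  // If it is still failing when the Timeout elapses, ScalaTest fails with
  // "The code passed to eventually never returned normally. Attempted N times over ...",
  // quoting the last failure, which is how the driver log ended up in the report above.
  eventually(Timeout(Span(3, Minutes)), Interval(Span(100, Milliseconds))) {
    driverOutput() should include ("test PVs")
  }
}
{code}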
Issue Links
- is related to: SPARK-44495 Use the latest minikube in K8s IT (Resolved)
- links to