Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Environment: Bring up a cluster in EC2 using spark-ec2 scripts
Description
If you try to launch a PySpark shell with "spark.executor.extraJavaOptions" set to "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps", the executors never come up on any of the workers.
I see the following error in the worker log file:
Spark Executor Command: "/usr/lib/jvm/java/bin/java" "-cp" "/root/c3/lib/*::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar:" "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" "-Xms13312M" "-Xmx13312M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@HOSTNAME:45429/user/CoarseGrainedScheduler" "7" "HOSTNAME" "4" "akka.tcp://sparkWorker@HOSTNAME:39727/user/Worker" "app-20140423224526-0000"
========================================
Unrecognized VM option 'UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps'
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
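The command line above shows the root cause: the whole extraJavaOptions string is quoted as a single argument, so the JVM treats it as one unrecognized option instead of five. As an illustration only (a hypothetical sketch, not Spark's actual code), splitting the option string on whitespace before building the executor command yields separate, parseable arguments:

```python
def split_java_options(opts: str) -> list[str]:
    """Split a space-separated Java options string into individual
    JVM arguments, dropping empty tokens (hypothetical helper)."""
    return [tok for tok in opts.split() if tok]

opts = ("-XX:+UseCompressedOops -XX:+UseCompressedStrings "
        "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps")

# Each option becomes its own argv entry, so the JVM can parse them.
cmd = ["/usr/lib/jvm/java/bin/java"] + split_java_options(opts) + \
      ["-Xms13312M", "-Xmx13312M",
       "org.apache.spark.executor.CoarseGrainedExecutorBackend"]
print(cmd)
```

With the original behavior, `cmd` would instead contain the entire `opts` string as one element, reproducing the "Unrecognized VM option" failure.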
Issue Links
- duplicates SPARK-1609: Executor fails to start when Command.extraJavaOptions contains multiple Java options (Resolved)