[SPARK-1638] Executors fail to come up if "spark.executor.extraJavaOptions" is set


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Fix Version/s: None
    • Affects Version/s: 1.0.0
    • Component/s: Spark Core
    • Labels: None
    • Environment: Bring up a cluster in EC2 using spark-ec2 scripts

    Description

      If you try to launch a PySpark shell with "spark.executor.extraJavaOptions" set to "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps", the executors never come up on any of the workers.

      I see the following error in the log file:

      Spark Executor Command: "/usr/lib/jvm/java/bin/java" "-cp" "/root/c3/lib/*::/root/ephemeral-hdfs/conf:/root/spark/conf:/root/spark/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop1.0.4.jar:" "-XX:+UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps" "-Xms13312M" "-Xmx13312M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@HOSTNAME:45429/user/CoarseGrainedScheduler" "7" "HOSTNAME" "4" "akka.tcp://sparkWorker@HOSTNAME:39727/user/Worker" "app-20140423224526-0000"
      ========================================

      Unrecognized VM option 'UseCompressedOops -XX:+UseCompressedStrings -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps'
      Error: Could not create the Java Virtual Machine.
      Error: A fatal exception has occurred. Program will exit.
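
      The command quoted above shows the root cause: the entire value of "spark.executor.extraJavaOptions" is passed to the JVM as a single quoted argument, so the JVM treats everything after the first "-XX:+" as one unrecognized option rather than five separate flags. The Scala sketch below (hypothetical; "ExtraJavaOptionsDemo" is not Spark's launcher code, and only a two-flag subset of the reported options is used because the PrintGC* flags and -XX:+UseCompressedStrings are JVM-version-dependent) reproduces the failure and the whitespace-splitting workaround:

      import scala.sys.process._

      // Minimal sketch of the failure mode: passing all JVM flags as ONE argv
      // element vs. splitting them on whitespace first. Two flags that are
      // accepted by all JVM versions stand in for the full list from the report;
      // the splitting problem is identical for any list of options.
      object ExtraJavaOptionsDemo {
        def main(args: Array[String]): Unit = {
          val extraJavaOptions = "-XX:+UseCompressedOops -verbose:gc"

          // Broken: one argv element, exactly as in the executor command above.
          // The JVM reports: Unrecognized VM option 'UseCompressedOops -verbose:gc'
          val broken = Seq("java", extraJavaOptions, "-version")

          // Workaround: tokenize on whitespace so each flag is its own argument.
          val fixed = Seq("java") ++ extraJavaOptions.split("\\s+") ++ Seq("-version")

          println(s"broken exit code = ${broken.!}") // non-zero: JVM refuses to start
          println(s"fixed exit code  = ${fixed.!}")  // 0: JVM starts normally
        }
      }

      Any fix in Spark itself would presumably need to do the same tokenization when the Worker builds the executor command, rather than forwarding the property value as one opaque string.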


People

    Assignee: Unassigned
    Reporter: Kalpit Shah
    Votes: 0
    Watchers: 1
