SPARK-1330: compute_classpath.sh has extra echo which prevents spark-class from working


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.0
    • Component/s: Deploy
    • Labels: None

    Description

      If I just use spark-class to run an example on YARN, it errors out because of the echo added to compute_classpath.sh under the Hive assembly check:

      echo "Hive assembly found, including hive support. If this isn't desired run sbt hive/clean."

      This causes the classpath to look like:

      exec /home/y/share/yjava_jdk/java//bin/java -cp Hive assembly found, including hive support. If this isn't desired run sbt hive/clean.
      /home/user1/user1cs-spark/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.10.jar:/home/user1/user1cs-spark/conf:/home/user1/user1cs-spark/lib_managed/jars/datanucleus-api-jdo-3.2.1.jar:/home/user1/user1cs-spark/lib_managed/jars/datanucleus-core-3.2.2.jar:/home/user1/user1cs-spark/lib_managed/jars/datanucleus-rdbms-3.2.1.jar:/home/user1/user1cs-spark/sql/hive/target/scala-2.10//spark-hive-assembly-1.0.0-SNAPSHOT-hadoop0.23.10.jar:/home/user1/yarn_install//conf -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.yarn.Client --jar examples/target/scala-2.10/spark-examples-assembly-1.0.0-SNAPSHOT.jar --class org.apache.spark.examples.SparkPi --args yarn-cluster
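
      Because spark-class builds its classpath by capturing the stdout of compute_classpath.sh (roughly CLASSPATH=$(.../compute-classpath.sh)), anything the script echoes to stdout gets spliced into the -cp argument, as seen above. A minimal sketch of one way to avoid this, assuming the informational message is still wanted, is to send it to stderr instead of stdout; the variable name hive_assembly_jar and the surrounding if block here are illustrative, and the actual patch may differ:

        # Illustrative sketch of the Hive assembly check in compute_classpath.sh.
        if [ -f "$hive_assembly_jar" ]; then
          # Send the notice to stderr (1>&2) so it does not pollute the
          # classpath string that spark-class captures from stdout.
          echo "Hive assembly found, including hive support. If this isn't desired run sbt hive/clean." 1>&2
          CLASSPATH="$CLASSPATH:$hive_assembly_jar"
        fi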

    People

      Assignee: tgraves Thomas Graves
      Reporter: tgraves Thomas Graves
      Votes: 0
      Watchers: 2
