SPARK-4556: Document that make-distribution.sh is required to make a runnable distribution


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.4.0
    • Component/s: Build, Deploy, Documentation
    • Labels: None

    Description

      After building the binary distribution assembly, the resultant tarball can't be used to run Spark in local mode.
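
      For context, the summary refers to the distribution script at the repository root. A sketch of an invocation that should produce a directly runnable tarball (the flag and profile below are assumptions, not the command used in this report):

      # Hypothetical make-distribution.sh invocation; --tgz packages the result as a
      # tarball, and -Pbigtop-dist mirrors the Maven profile used in the build below.
      ./make-distribution.sh --tgz -Pbigtop-dist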

      busbey2-MBA:spark busbey$ mvn -Pbigtop-dist -DskipTests=true package
      [INFO] Scanning for projects...
      ...SNIP...
      [INFO] ------------------------------------------------------------------------
      [INFO] Reactor Summary:
      [INFO] 
      [INFO] Spark Project Parent POM ........................... SUCCESS [ 32.227 s]
      [INFO] Spark Project Networking ........................... SUCCESS [ 31.402 s]
      [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  8.864 s]
      [INFO] Spark Project Core ................................. SUCCESS [15:39 min]
      [INFO] Spark Project Bagel ................................ SUCCESS [ 29.470 s]
      [INFO] Spark Project GraphX ............................... SUCCESS [05:20 min]
      [INFO] Spark Project Streaming ............................ SUCCESS [11:02 min]
      [INFO] Spark Project Catalyst ............................. SUCCESS [11:26 min]
      [INFO] Spark Project SQL .................................. SUCCESS [11:33 min]
      [INFO] Spark Project ML Library ........................... SUCCESS [14:27 min]
      [INFO] Spark Project Tools ................................ SUCCESS [ 40.980 s]
      [INFO] Spark Project Hive ................................. SUCCESS [11:45 min]
      [INFO] Spark Project REPL ................................. SUCCESS [03:15 min]
      [INFO] Spark Project Assembly ............................. SUCCESS [04:22 min]
      [INFO] Spark Project External Twitter ..................... SUCCESS [ 43.567 s]
      [INFO] Spark Project External Flume Sink .................. SUCCESS [ 50.367 s]
      [INFO] Spark Project External Flume ....................... SUCCESS [01:41 min]
      [INFO] Spark Project External MQTT ........................ SUCCESS [ 40.973 s]
      [INFO] Spark Project External ZeroMQ ...................... SUCCESS [ 54.878 s]
      [INFO] Spark Project External Kafka ....................... SUCCESS [01:23 min]
      [INFO] Spark Project Examples ............................. SUCCESS [10:19 min]
      [INFO] ------------------------------------------------------------------------
      [INFO] BUILD SUCCESS
      [INFO] ------------------------------------------------------------------------
      [INFO] Total time: 01:47 h
      [INFO] Finished at: 2014-11-22T02:13:51-06:00
      [INFO] Final Memory: 79M/2759M
      [INFO] ------------------------------------------------------------------------
      busbey2-MBA:spark busbey$ cd assembly/target/
      busbey2-MBA:target busbey$ mkdir dist-temp
      busbey2-MBA:target busbey$ tar -C dist-temp -xzf spark-assembly_2.10-1.3.0-SNAPSHOT-dist.tar.gz 
      busbey2-MBA:target busbey$ cd dist-temp/
      busbey2-MBA:dist-temp busbey$ ./bin/spark-shell
      ls: /Users/busbey/projects/spark/assembly/target/dist-temp/assembly/target/scala-2.10: No such file or directory
      Failed to find Spark assembly in /Users/busbey/projects/spark/assembly/target/dist-temp/assembly/target/scala-2.10
      You need to build Spark before running this program.
      

      It looks like the classpath calculation in bin/compute-classpath.sh doesn't handle the layout of the extracted tarball.
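
      Roughly, the lookup goes like this (a paraphrased sketch of the check, not the script verbatim); without a RELEASE marker at the top level it only searches the source-tree location, which does not exist inside the extracted tarball:

      # Paraphrased sketch of the assembly lookup (assumed structure, not verbatim).
      FWDIR="$(cd "$(dirname "$0")/.."; pwd)"                # distribution root
      if [ -f "$FWDIR/RELEASE" ]; then
        assembly_folder="$FWDIR/lib"                         # packaged-release layout
      else
        assembly_folder="$FWDIR/assembly/target/scala-2.10"  # source-tree layout
      fi
      if ! ls "$assembly_folder"/spark-assembly*.jar >/dev/null 2>&1; then
        echo "Failed to find Spark assembly in $assembly_folder"
        echo "You need to build Spark before running this program."
        exit 1
      fi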

      If I move all of the spark-*.jar files from the top level into the lib folder and touch the RELEASE file, then the Spark shell launches normally in local mode.
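
      Concretely, something like the following (run from the top of the extracted distribution; a sketch of the workaround described above, where creating lib/ is an assumption in case the tarball lacks it):

      mkdir -p lib           # create lib/ if the tarball does not already have it
      mv spark-*.jar lib/    # move the assembly jar(s) out of the top level
      touch RELEASE          # marker that switches the classpath lookup to lib/
      ./bin/spark-shell      # now starts in local mode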

          People

            Assignee: Sean R. Owen (srowen)
            Reporter: Sean Busbey (busbey)