Spark / SPARK-15542

Make error message clear for script './R/install-dev.sh' when R is missing on Mac


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.0
    • Component/s: SparkR
    • Labels: None
    • Environment: Mac OS El Capitan

    Description

      I followed the instructions at https://github.com/apache/spark/tree/master/R to build the SparkR project. When running

      build/mvn -DskipTests -Psparkr package

      I got the error below:

      [INFO] ------------------------------------------------------------------------
      [INFO] Reactor Summary:
      [INFO]
      [INFO] Spark Project Parent POM ........................... SUCCESS [ 23.589 s]
      [INFO] Spark Project Tags ................................. SUCCESS [ 19.389 s]
      [INFO] Spark Project Sketch ............................... SUCCESS [  6.386 s]
      [INFO] Spark Project Networking ........................... SUCCESS [ 12.296 s]
      [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  7.817 s]
      [INFO] Spark Project Unsafe ............................... SUCCESS [ 10.825 s]
      [INFO] Spark Project Launcher ............................. SUCCESS [ 12.262 s]
      [INFO] Spark Project Core ................................. FAILURE [01:40 min]
      [INFO] Spark Project GraphX ............................... SKIPPED
      [INFO] Spark Project Streaming ............................ SKIPPED
      [INFO] Spark Project Catalyst ............................. SKIPPED
      [INFO] Spark Project SQL .................................. SKIPPED
      [INFO] Spark Project ML Local Library ..................... SKIPPED
      [INFO] Spark Project ML Library ........................... SKIPPED
      [INFO] Spark Project Tools ................................ SKIPPED
      [INFO] Spark Project Hive ................................. SKIPPED
      [INFO] Spark Project REPL ................................. SKIPPED
      [INFO] Spark Project Assembly ............................. SKIPPED
      [INFO] Spark Project External Flume Sink .................. SKIPPED
      [INFO] Spark Project External Flume ....................... SKIPPED
      [INFO] Spark Project External Flume Assembly .............. SKIPPED
      [INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
      [INFO] Spark Project Examples ............................. SKIPPED
      [INFO] Spark Project External Kafka Assembly .............. SKIPPED
      [INFO] Spark Project Java 8 Tests ......................... SKIPPED
      [INFO] ------------------------------------------------------------------------
      [INFO] BUILD FAILURE
      [INFO] ------------------------------------------------------------------------
      [INFO] Total time: 03:14 min
      [INFO] Finished at: 2016-05-25T21:51:58+00:00
      [INFO] Final Memory: 55M/782M
      [INFO] ------------------------------------------------------------------------
      [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project spark-core_2.11: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
      [ERROR]
      [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
      [ERROR] Re-run Maven using the -X switch to enable full debug logging.
      [ERROR]
      [ERROR] For more information about the errors and possible solutions, please read the following articles:
      [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
      [ERROR]
      [ERROR] After correcting the problems, you can resume the build with the command
      [ERROR]   mvn <goals> -rf :spark-core_2.11
      

      This error turned out to be caused by

      ./R/install-dev.sh

      I then ran the install-dev.sh script directly, and got:

      mbp185-xr:spark xin$ ./R/install-dev.sh
      usage: dirname path
      

      This message was very confusing to me. I then found that R is not properly configured on my Mac, while the script uses

      $(which R)

      to locate R's home directory.
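For illustration, a minimal shell sketch (hypothetical binary name; not the actual install-dev.sh source) of why an empty `which R` result produces the bare dirname usage message:

```shell
#!/bin/sh
# Hypothetical reproduction: when R is not installed, `which` prints nothing,
# so the unquoted substitution expands to zero arguments, and dirname emits
# a usage/operand error (e.g. "usage: dirname path" on macOS) instead of a path.
missing="$(which no-such-R-binary-xyz 2>/dev/null || true)"
echo "substitution result: '$missing'"   # empty string when the binary is absent
dirname $missing 2>&1 | head -n 1        # usage/operand error, not a directory
```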

      I tried the same situation on CentOS with R missing, and it gives a very clear error message, while macOS does not.

      On CentOS:

      [root@ip-xxx-31-9-xx spark]# which R
      /usr/bin/which: no R in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/bin:/root/bin)

      But on the Mac, if R is not found, nothing is returned, which causes the confusing messages from both the R build failure and from running R/install-dev.sh directly:

      mbp185-xr:spark xin$ which R
      mbp185-xr:spark xin$
       

      So a clearer message is needed for this R misconfiguration when running R/install-dev.sh.
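A sketch of one possible guard, assuming a hypothetical helper `find_r_home` (illustrative only; not the actual SPARK-15542 patch): fail fast with an explicit message when R is not on the PATH, instead of letting `dirname` choke on an empty `which R` result.

```shell
#!/bin/sh
# Hypothetical guard for an install-dev.sh-style script: resolve the directory
# containing a binary, or fail with a clear message if it is not on the PATH.
find_r_home() {
  r_bin="$(which "$1" 2>/dev/null)"      # empty string when "$1" is missing
  if [ -z "$r_bin" ]; then
    echo "Cannot find '$1'. Please install R and make sure it is on your PATH." >&2
    return 1
  fi
  dirname "$r_bin"                       # quoted, so dirname always gets one argument
}

find_r_home R || true   # prints R's bin directory, or the explicit error above
```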


          People

            Assignee: Xin Ren (iamshrek)
            Reporter: Xin Ren (iamshrek)
            Votes: 0
            Watchers: 2
