  Bigtop / BIGTOP-2654

Spark 2.1 binaries need either SPARK_HOME or the missing find-spark-home executable

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.2.0
    • Component/s: None
    • Labels: None

      Description

      spark-shell and the other executables need either the SPARK_HOME environment variable or the find-spark-home executable.

      find-spark-home is not packaged (which makes sense, since we use the hardcoded /usr/lib/spark when packaging Spark).

      The executables therefore do not run unless the SPARK_HOME environment variable is set.

      I prefer not to add a Puppet script that creates an /etc/profile.d snippet to fix the situation.

      I tend to patch the executables instead, so that SPARK_HOME is set within the executable itself (like spark-env.sh does):

      export SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
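
      For concreteness, a minimal sketch of what the patched prologue of each launcher (spark-shell, spark-submit, ...) could look like, assuming our hardcoded /usr/lib/spark install prefix:

      # Sketch of the proposed patch: default SPARK_HOME to the packaged
      # install location instead of sourcing the missing find-spark-home helper.
      if [ -z "${SPARK_HOME}" ]; then
        export SPARK_HOME=/usr/lib/spark
      fi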
      

      Comments?

      Or am I missing something important? (Maybe a Debian-specific problem?)

        Activity

        asanjar Amir Sanjar added a comment -

        hm, I have not seen this problem on Ubuntu 16.04. Could you reproduce the problem while executing pyspark or running a job using spark-submit?

        oflebbe Olaf Flebbe added a comment -

        Hi Amir Sanjar: both commands.

        Have a look at the first lines in spark-submit:

        if [ -z "${SPARK_HOME}" ]; then
          source "$(dirname "$0")"/find-spark-home
        fi
        

        Do you have SPARK_HOME set, or do you have find-spark-home on your disk?

        asanjar Amir Sanjar added a comment -

        Neither one is set:

        root@64735753aa17:/bigtop# find / -name find-spark-home <<=== no find-spark-home
        root@64735753aa17:/bigtop# echo $SPARK_HOME <<=== SPARK_HOME not set

        root@64735753aa17:/bigtop# spark-shell
        Setting default log level to "WARN".
        To adjust logging level use sc.setLogLevel(newLevel).
        17/01/03 11:31:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        17/01/03 11:31:11 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
        Spark context Web UI available at http://172.17.0.2:4040
        Spark context available as 'sc' (master = local[*], app id = local-1483443071242).
        Spark session available as 'spark'.
        Welcome to
              ____              __
             / __/__  ___ _____/ /__
            _\ \/ _ \/ _ `/ __/  '_/
           /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
              /_/

        Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
        Type in expressions to have them evaluated.
        Type :help for more information.

        scala>

        asanjar Amir Sanjar added a comment - edited

        oh wait..
        spark-env.sh has by default "export SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}"

        oflebbe Olaf Flebbe added a comment - edited

        Amir Sanjar: Wait. Why does your observation matter? spark-env.sh is not sourced by anything.

        Please tell me the output of type spark-shell. Is it a command, an alias, or something else?

        Please verify the package with dpkg -V spark-core. Did you or someone else patch the command after install?

        Please run bash -xv /usr/lib/spark/bin/spark-shell and tell me the output.

        asanjar Amir Sanjar added a comment - edited

        Output of "bash -xv /usr/lib/spark/bin/spark-shell":

        #!/usr/bin/env bash
        
        #
        # Licensed to the Apache Software Foundation (ASF) under one or more
        # contributor license agreements.  See the NOTICE file distributed with
        # this work for additional information regarding copyright ownership.
        # The ASF licenses this file to You under the Apache License, Version 2.0
        # (the "License"); you may not use this file except in compliance with
        # the License.  You may obtain a copy of the License at
        #
        #    http://www.apache.org/licenses/LICENSE-2.0
        #
        # Unless required by applicable law or agreed to in writing, software
        # distributed under the License is distributed on an "AS IS" BASIS,
        # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
        # See the License for the specific language governing permissions and
        # limitations under the License.
        #
        
        #
        # Shell script for starting the Spark Shell REPL
        
        cygwin=false
        + cygwin=false
        case "`uname`" in
          CYGWIN*) cygwin=true;;
        esac
        + case "`uname`" in
        uname
        ++ uname
        
        # Enter posix mode for bash
        set -o posix
        + set -o posix
        
        if [ -z "${SPARK_HOME}" ]; then
          export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
        fi
        + '[' -z '' ']'
        cd "`dirname "$0"`"/..; pwd
        dirname "$0"
        +++ dirname /usr/lib/spark/bin/spark-shell
        ++ cd /usr/lib/spark/bin/..
        ++ pwd
        + export SPARK_HOME=/usr/lib/spark
        + SPARK_HOME=/usr/lib/spark
        
        export _SPARK_CMD_USAGE="Usage: ./bin/spark-shell [options]"
        + export '_SPARK_CMD_USAGE=Usage: ./bin/spark-shell [options]'
        + _SPARK_CMD_USAGE='Usage: ./bin/spark-shell [options]'
        
        # SPARK-4161: scala does not assume use of the java classpath,
        # so we need to add the "-Dscala.usejavacp=true" flag manually. We
        # do this specifically for the Spark shell because the scala REPL
        # has its own class loader, and any additional classpath specified
        # through spark.driver.extraClassPath is not automatically propagated.
        SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true"
        + SPARK_SUBMIT_OPTS=' -Dscala.usejavacp=true'
        
        function main() {
          if $cygwin; then
            # Workaround for issue involving JLine and Cygwin
            # (see http://sourceforge.net/p/jline/bugs/40/).
            # If you're using the Mintty terminal emulator in Cygwin, may need to set the
            # "Backspace sends ^H" setting in "Keys" section of the Mintty options
            # (see https://github.com/sbt/sbt/issues/562).
            stty -icanon min 1 -echo > /dev/null 2>&1
            export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
            "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
            stty icanon echo > /dev/null 2>&1
          else
            export SPARK_SUBMIT_OPTS
            "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
          fi
        }
        
        # Copy restore-TTY-on-exit functions from Scala script so spark-shell exits properly even in
        # binary distribution of Spark where Scala is not installed
        exit_status=127
        + exit_status=127
        saved_stty=""
        + saved_stty=
        
        # restore stty settings (echo in particular)
        function restoreSttySettings() {
          stty $saved_stty
          saved_stty=""
        }
        
        function onExit() {
          if [[ "$saved_stty" != "" ]]; then
            restoreSttySettings
          fi
          exit $exit_status
        }
        
        # to reenable echo if we are interrupted before completing.
        trap onExit INT
        + trap onExit INT
        
        # save terminal settings
        saved_stty=$(stty -g 2>/dev/null)
        stty -g 2>/dev/null
        ++ stty -g
        + saved_stty=500:5:bf:8a3b:3:1c:7f:15:4:0:1:0:11:13:1a:0:12:f:17:16:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0
        # clear on error so we don't later try to restore them
        if [[ ! $? ]]; then
          saved_stty=""
        fi
        + [[ ! -n 0 ]]
        
        main "$@"
        + main
        + false
        + export SPARK_SUBMIT_OPTS
        + /usr/lib/spark/bin/spark-submit --class org.apache.spark.repl.Main --name 'Spark shell'
        Setting default log level to "WARN".
        To adjust logging level use sc.setLogLevel(newLevel).
        17/01/03 11:59:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        17/01/03 11:59:39 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
        Spark context Web UI available at http://172.17.0.2:4040
        Spark context available as 'sc' (master = local[*], app id = local-1483444779817).
        Spark session available as 'spark'.
        Welcome to
              ____              __
             / __/__  ___ _____/ /__
            _\ \/ _ \/ _ `/ __/  '_/
           /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
              /_/
                 
        Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_111)
        Type in expressions to have them evaluated.
        Type :help for more information.
        
        scala> 
        asanjar Amir Sanjar added a comment -

        dpkg -V spark-core
        5???? c /etc/spark/conf.dist/spark-env.sh

        oflebbe Olaf Flebbe added a comment -

        That's funny, my spark-shell command reads:

        if [ -z "${SPARK_HOME}" ]; then
          source "$(dirname "$0")"/find-spark-home
        fi
        

        yours reads:

        export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
        

        WTF? What OS is that?

        oflebbe Olaf Flebbe added a comment - edited

        Ah, I got it. I am already looking at spark-2.1.0! You are using spark-2.0.2!

        The spark devs broke the command.
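
        In other words, the 2.0.2 launchers compute SPARK_HOME inline (as in the trace above), while the 2.1 launchers delegate that to find-spark-home, which we do not ship. As a rough, conceptual sketch of the fallback such a helper performs (not the actual upstream script, which also handles other install layouts):

        # Conceptual sketch only, not the upstream find-spark-home: derive
        # SPARK_HOME from the launcher's own location when it is not set.
        if [ -z "${SPARK_HOME}" ]; then
          export SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)"
        fi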

        asanjar Amir Sanjar added a comment -

        ooh, I just upgraded Bigtop Spark to 2.1, should I undo?

        oflebbe Olaf Flebbe added a comment -

        Please propose a patch here: Add find-spark-home to the files packaged.
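
        For illustration only, the change could look roughly like the following; BUILD_DIR and PREFIX are placeholder names, not the actual variables in our packaging scripts:

        # Hypothetical sketch: install the helper next to the other launcher
        # scripts, so spark-shell/spark-submit can source it when SPARK_HOME
        # is unset.
        install -m 0755 "${BUILD_DIR}/bin/find-spark-home" "${PREFIX}/usr/lib/spark/bin/"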

        asanjar Amir Sanjar added a comment -

        will do


          People

          • Assignee:
            asanjar Amir Sanjar
            Reporter:
            oflebbe Olaf Flebbe
          • Votes:
            0
            Watchers:
            2
