ZEPPELIN-5397: SPARK interpreter not starting


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.9.0
    • Fix Version/s: None
    • Component/s: interpreter-launcher, spark
    • Labels: None
    • Environment: We are using Docker to run Zeppelin. Both Zeppelin and Spark are installed inside the same container, which runs Debian Buster. Mesos runs in a separate cluster, and SPARK_MASTER is set to point to it.

    Description

      We are setting the following SPARK_SUBMIT_OPTIONS in the zeppelin-env.sh file:

       

      export SPARK_SUBMIT_OPTIONS="$SPARK_SUBMIT_OPTIONS --conf 'spark.driver.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin'"
      export SPARK_SUBMIT_OPTIONS="$SPARK_SUBMIT_OPTIONS --conf 'spark.executor.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin -Dfile.encoding=UTF-8'"
      

      The Spark interpreter fails to start with the following error:

       

      Error: Unrecognized option: -Dcom.example.role='
      

      We investigated and the issue appears to be in the bin/interpreter.sh script, which fails to parse SPARK_SUBMIT_OPTIONS correctly. When INTERPRETER_RUN_COMMAND is expanded, we can see how SPARK_SUBMIT_OPTIONS is interpreted (shown below) by running the interpreter.sh command directly:

      bash -x /opt/zeppelin/bin/interpreter.sh -d /opt/zeppelin/interpreter/spark -c 172.17.0.2 -p 38923 -r : -i spark-shared_process -l /opt/zeppelin/local-repo/spark -g spark
      

      Excerpt of the output:

      --conf ''\''spark.driver.extraJavaOptions=-Dcom.example.env=production' '-Dcom.example.role=zeppelin'\''' --conf ''\''spark.executor.extraJavaOptions=-Dcom.example.env=production' -Dcom.example.role=zeppelin '-Dfile.encoding=UTF-8'\'''
      

      Upon further investigation, we found that the problem is the whitespace between -Dcom.example.env=production and -Dcom.example.role=zeppelin in spark.driver.extraJavaOptions (and similarly in spark.executor.extraJavaOptions): the embedded single quotes are not honored when the variable is expanded, so the option value is split at the space. A minimal reproduction is sketched below.

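      For illustration, here is a minimal, self-contained sketch (not the actual interpreter.sh logic) of the shell behavior we believe is at fault: when a variable containing embedded single quotes is expanded unquoted, bash word-splits on the space inside the quotes and keeps the quote characters as literal text.

      #!/bin/bash
      # Minimal sketch, not Zeppelin code: a value like the one set in
      # zeppelin-env.sh, with single quotes embedded in the string.
      OPTS="--conf 'spark.driver.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin'"

      # Unquoted expansion: the embedded single quotes are NOT re-parsed as
      # quoting; bash splits the value on whitespace, yielding three words.
      printf '<%s>\n' $OPTS
      # <--conf>
      # <'spark.driver.extraJavaOptions=-Dcom.example.env=production>
      # <-Dcom.example.role=zeppelin'>

      spark-submit then receives -Dcom.example.role=zeppelin' as a standalone argument, which matches the "Unrecognized option" error above.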
      This issue was not present in a SNAPSHOT version of 0.9.0 (we are not able to find that build anymore). Attaching the interpreter.sh from that version, which works: interpreter.sh
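      As a possible workaround (an assumption on our part, not a verified fix), the same JVM options can be supplied without any embedded shell quoting by placing them in Spark's conf/spark-defaults.conf, which spark-submit parses itself and which is not subject to shell word-splitting:

      # conf/spark-defaults.conf -- sketch of a possible workaround; everything
      # after the key on each line is taken as the value, spaces included.
      spark.driver.extraJavaOptions    -Dcom.example.env=production -Dcom.example.role=zeppelin
      spark.executor.extraJavaOptions  -Dcom.example.env=production -Dcom.example.role=zeppelin -Dfile.encoding=UTF-8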

       

      Attachments

        1. interpreter.sh
          11 kB
          Nilanjan Roy


    People

      Assignee: Unassigned
      Reporter: Nilanjan Roy (nilanjan1)
      Votes: 0
      Watchers: 3


    Time Tracking

      Original Estimate: Not Specified
      Remaining Estimate: 0h
      Time Spent: 5h 40m