ZEPPELIN-1263: Should specify Zeppelin's Spark configuration through --conf arguments of spark-submit


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.6.0
    • Fix Version/s: 0.8.0
    • Component/s: None
    • Labels: None

    Description

      Currently, Zeppelin's Spark configuration is loaded at runtime inside the RemoteInterpreter process rather than before the process starts (via the --conf arguments of spark-submit). This works for most Spark settings, but some of them cause subtle issues when applied this late. For example (see ZEPPELIN-1242), if spark.master is set to yarn-client in spark-defaults.conf but to local on the Zeppelin side, the Spark interpreter fails to start because of the inconsistency. Another case is spark.driver.memory, which does not take effect at runtime because the driver JVM's memory must be fixed before it starts.
      So I propose passing Zeppelin's Spark configuration through the --conf arguments of spark-submit.
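      To make the proposal concrete, here is a minimal, hypothetical sketch of a launcher forwarding Zeppelin-side properties as --conf flags. The property values and the jar name are illustrative assumptions, not Zeppelin's actual launch command; the sketch only prints the command it would run.

```shell
#!/bin/sh
# Hypothetical launcher sketch: forward Zeppelin-side Spark properties to
# spark-submit via --conf so they apply BEFORE the driver JVM starts.
# (spark.driver.memory in particular cannot be changed after startup.)
SPARK_CONF="--conf spark.master=yarn-client --conf spark.driver.memory=4g"

# Print the command rather than executing it, since spark-submit may not be
# on PATH here; a real launcher would exec it directly.
echo "spark-submit $SPARK_CONF --class org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer zeppelin-spark.jar"
```

      Because the properties reach spark-submit on the command line, they are in force when the interpreter process is created, avoiding the inconsistency described above.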

    People

      Assignee: zjffdu Jeff Zhang
      Reporter: zjffdu Jeff Zhang
      Votes: 2
      Watchers: 7
