Zeppelin / ZEPPELIN-1741

JARs specified with spark.jars in spark-defaults.conf do not affect the %pyspark interpreter


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.6.2
    • Fix Version/s: None
    • Component/s: Interpreters
    • Labels: None
    • Environment: Zeppelin 0.6.2, Spark 1.6.2

    Description

      If you specify JARs with spark.jars in your spark-defaults.conf, the %pyspark interpreter will not load these JARs.
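
      For illustration, a minimal sketch of the setup where this shows up; the jar path and class name below are hypothetical, not taken from this report:

          # conf/spark-defaults.conf  (hypothetical jar path)
          spark.jars /opt/jars/my-library.jar

          # Zeppelin %pyspark paragraph (hypothetical class name)
          %pyspark
          clazz = sc._jvm.com.example.MyClass   # comes back as an unusable py4j JavaPackage
          print(clazz)                          # because the jar was never put on the classpath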

      Currently, a note in the Spark interpreter documentation says, "Note that adding jar to pyspark is only available via %dep interpreter at the moment."
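
      A rough sketch of that %dep workaround (the jar path and Maven coordinates below are hypothetical); per the documentation, the %dep paragraph has to run before the Spark context is started:

          %dep
          z.reset()                                       // clear previously loaded artifacts
          z.load("/opt/jars/my-library.jar")              // load a local jar
          // or: z.load("com.example:my-library:1.0.0")   // load by Maven coordinates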

      This is undesirable for two reasons:

      1) %dep is deprecated.
      2) Sysadmins configure spark-defaults.conf and expect those settings to be honored however Spark is executed.

      As a Zeppelin user, I expect that if I configure JARs in spark-defaults.conf, these JARs will be available when the %pyspark interpreter is run.


People

    Assignee: Unassigned
    Reporter: jerzygangi (JJ Gangi)
    Votes: 0
    Watchers: 5
