Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 0.6.2
- Fix Version/s: None
- Component/s: None
- Environment: Zeppelin 0.6.2, Spark 1.6.2
Description
If you specify JARs with spark.jars in your spark-defaults.conf, the %pyspark interpreter will not load them.
Currently, a note in the Spark interpreter documentation says, "Note that adding jar to pyspark is only availabe via %dep interpreter at the moment."
This is undesirable for two reasons:
1) %dep is deprecated.
2) Sysadmins configure spark-defaults.conf and expect these settings to be honored however Spark is executed.
As a Zeppelin user, I expect that if I configure JARs in spark-defaults.conf, those JARs will be available when the %pyspark interpreter runs.
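To make the failure mode concrete, here is a minimal reproduction sketch. The jar path /opt/jars/some-lib.jar and the class com.example.SomeClass are hypothetical placeholders, not taken from this report; the check simply asks the driver JVM (via the py4j gateway on the SparkContext that the %pyspark interpreter injects as sc) whether a class shipped in the configured jar is on its classpath.

%pyspark
# spark-defaults.conf on the Zeppelin host contains, e.g.:
#   spark.jars  /opt/jars/some-lib.jar      (hypothetical path)
#
# Expected: a class packaged in that jar is visible to the driver JVM.
# Observed: when run through the %pyspark interpreter, it is not.
from py4j.protocol import Py4JError

try:
    # 'sc' is the SparkContext injected by %pyspark; sc._jvm is its py4j gateway.
    sc._jvm.java.lang.Class.forName("com.example.SomeClass")  # hypothetical class from the jar
    print("class from spark.jars IS visible to %pyspark")
except Py4JError as err:
    print("class from spark.jars is NOT visible to %pyspark: %s" % err)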
Issue Links
- relates to: ZEPPELIN-1883 Can't import packages requested by SPARK_SUBMIT_OPTION in pyspark (Resolved)