Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version: 0.8.0
- Fix Version: None
- Component: None
Tested with zeppelin-0.8.0-rc2 from https://dist.apache.org/repos/dist/dev/zeppelin/zeppelin-0.8.0-rc2/ and with a version built from sources (branch-0.8, commit a88e4679a2f28a914fa181ad2df55e3744a8ff6b) with this command:
```
mvn clean package -Pbuild-distr -DskipTests -Pspark-2.1 -Phadoop-2.4 -Pyarn -Ppyspark -Psparkr -Pr -Pscala-2.11 -Pcassandra-spark-1.5
```
On 0.7.3 this functionality works well.
Description
In many cases (but not all), dependencies that were added and successfully downloaded via %spark.dep (with an interpreter restart) do not work in a following %spark paragraph.
If the same dependencies are added via the interpreter configuration UI, they work well.
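For comparison, the interpreter-configuration route that does work can also be expressed as an interpreter property. A minimal sketch, assuming the standard Spark property `spark.jars.packages` is set in the Spark interpreter settings (the exact artifact list mirrors the %spark.dep calls below):

```
# Hypothetical Spark interpreter property entry (workaround sketch):
spark.jars.packages  com.lihaoyi:upickle_2.11:0.6.6,com.lihaoyi:ujson_2.11:0.6.6
```

With this set and the interpreter restarted, the %spark paragraph below resolves the imports as expected.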
Example:
```
%spark.dep
z.load("com.lihaoyi:upickle_2.11:0.6.6")
z.load("com.lihaoyi:ujson_2.11:0.6.6")
```
output:
```
res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep.Dependency@73a5892c
res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep.Dependency@73a5892c
```
```
%spark
import ujson.Js
val str = """{ "a" : 1 }"""
val json = ujson.read(str)
print(json("a"))
```
output:
```
<console>:25: error: not found: value ujson
import ujson.Js
```
Expected:
```
scala> import ujson.Js
import ujson.Js

scala> val str = """{ "a" : 1 }"""
str: String = { "a" : 1 }

scala> val json = ujson.read(str)
json: ujson.Js.Value = {"a":1}

scala> print(json("a"))
1
```