Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
- Affects Version/s: 0.8.0
- Fix Version/s: None
- Component/s: None
- Labels: None
Description
Hi,
Previously I was able to have code like this in my 0.7.3 Spark Interpreter (YARN cluster):
    val word2Vec = new Word2Vec()
      .setInputCol("filtered")
      .setOutputCol("word2vec")
      .setVectorSize(100)
      .setMinCount(10)
      .setMaxIter(20)
But the same code in Zeppelin 0.8 on my local machine gives me this error (I am testing the new release before I upgrade the one on the cluster):
    <console>:1: error: illegal start of definition
NOTE: I am raising this issue because I saw a similar issue being closed merely because there is a workaround of putting '.' at the end of each line. It is not acceptable to ask all of our users to change hundreds of lines of code they have written in Zeppelin, or to stop copy-pasting from IntelliJ. The fact that this style did not work in 0.6, was fixed in 0.7.3, and broke again in 0.8.x is not something I can explain to the data scientists using Zeppelin.
So, please, no workarounds like wrapping the code in something or putting '.' at the end of each line. It is just not reasonable to ask users to spend so much time re-formatting their code.
Many thanks.
UPDATE: I have learned that setting "zeppelin.spark.useNew" to false makes it possible to reuse the old code. Still, it would be great if we could keep it set to "true" to take advantage of the new features in the new SparkInterpreter.
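For anyone hitting the same error in the meantime, the workaround above is a single interpreter property (the exact UI path may vary by version; in 0.8 it is typically edited under the Interpreter menu, in the spark interpreter's properties, followed by an interpreter restart):

    zeppelin.spark.useNew = false

Note this falls back to the old SparkInterpreter implementation, so the new 0.8 interpreter features are unavailable while it is set.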