Details
Type: Bug
Status: Open
Priority: Major
Resolution: Unresolved
Affects Version/s: 0.10.0
Fix Version/s: None
Component/s: None
Description
Here is the error scenario in detail.
First, I downloaded [zeppelin-0.10.0-bin-all.tgz|https://dlcdn.apache.org/zeppelin/zeppelin-0.10.0/zeppelin-0.10.0-bin-all.tgz] and extracted it with tar -zxvf zeppelin-0.10.0-bin-all.tgz.
Second, I modified zeppelin-env.sh and zeppelin-site.xml:
export JAVA_HOME=/moudle/jdk1.8
export SPARK_HOME=/moudle/spark-3.0.2
export HADOOP_HOME=/moudle/hadoop-3.3.0
export HADOOP_CONF_DIR=/moudle/hadoop-3.3.0/etc/hadoop
export PYSPARK_PYTHON=/usr/local/python37/bin/python3
export PYTHONPATH=/usr/local/python37/bin/python3
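For reference, here is a minimal standalone check (not part of Zeppelin; the paths are copied from the zeppelin-env.sh settings above) to confirm that every configured path actually exists on the host, since a missing directory on the interpreter's classpath can trigger the failure reported below:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class CheckPaths {
    // Returns the entries from the given list that do not exist on this host.
    static List<String> missing(String[] paths) {
        List<String> out = new ArrayList<>();
        for (String p : paths) {
            if (!new File(p).exists()) {
                out.add(p);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // Paths copied from the zeppelin-env.sh above; adjust for your host.
        String[] paths = {
            "/moudle/jdk1.8",
            "/moudle/spark-3.0.2",
            "/moudle/hadoop-3.3.0",
            "/moudle/hadoop-3.3.0/etc/hadoop",
            "/usr/local/python37/bin/python3"
        };
        for (String p : missing(paths)) {
            System.out.println("missing: " + p);
        }
    }
}
```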
When I configure zeppelin-0.9.0 the same way, everything works, but zeppelin-0.10.0 reports the following error when I run this paragraph:
%spark
import sqlContext.implicits._
val df = Seq(
  (1, 2, 3),
  (1, 2, 4),
  (1, 2, 3)
).toDF("first_col", "second_col", "third_col")
df.show()
org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:833)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:741)
    at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
    at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
    at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:46)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
    at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:322)
    at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:333)
    at org.apache.zeppelin.spark.SparkSqlInterpreter.open(SparkSqlInterpreter.java:56)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 8 more
Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:137)
    at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
    ... 12 more
Caused by: java.lang.NullPointerException
    at java.util.Arrays.sort(Arrays.java:1438)
    at scala.tools.nsc.classpath.JFileDirectoryLookup.listChildren(DirectoryClassPath.scala:118)
    at scala.tools.nsc.classpath.JFileDirectoryLookup.listChildren$(DirectoryClassPath.scala:102)
    at scala.tools.nsc.classpath.DirectoryClassPath.listChildren(DirectoryClassPath.scala:291)
    at scala.tools.nsc.classpath.DirectoryClassPath.listChildren(DirectoryClassPath.scala:291)
    at scala.tools.nsc.classpath.DirectoryLookup.list(DirectoryClassPath.scala:83)
    at scala.tools.nsc.classpath.DirectoryLookup.list$(DirectoryClassPath.scala:78)
    at scala.tools.nsc.classpath.DirectoryClassPath.list(DirectoryClassPath.scala:291)
    at scala.tools.nsc.classpath.AggregateClassPath.$anonfun$list$3(AggregateClassPath.scala:91)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:87)
    at scala.tools.nsc.util.ClassPath.list(ClassPath.scala:36)
    at scala.tools.nsc.util.ClassPath.list$(ClassPath.scala:36)
    at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:30)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:284)
    at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:230)
    at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1542)
    at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:257)
    at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:74)
    at scala.tools.nsc.Global.rootMirror(Global.scala:72)
    at scala.tools.nsc.Global.rootMirror(Global.scala:44)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:295)
    at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:295)
    at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1480)
    at scala.tools.nsc.Global$Run.<init>(Global.scala:1199)
    at scala.tools.nsc.interpreter.IMain._initialize(IMain.scala:132)
    at scala.tools.nsc.interpreter.IMain.initializeSynchronous(IMain.scala:154)
    at org.apache.zeppelin.spark.SparkScala212Interpreter.open(SparkScala212Interpreter.scala:84)
    at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:121)
    ... 13 more
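The root-cause frame, java.util.Arrays.sort(Arrays.java:1438) called from JFileDirectoryLookup.listChildren, matches the pattern where File.listFiles() returns null for a classpath entry that is not an existing directory, and sorting that null array throws the NullPointerException. A minimal standalone sketch of that failure mode (hypothetical demonstration, not Zeppelin's actual code):

```java
import java.io.File;
import java.util.Arrays;

public class NullListFiles {
    public static void main(String[] args) {
        File dir = new File("/no/such/directory");
        // listFiles() returns null (not an empty array) when the path
        // does not exist or is not a directory.
        File[] children = dir.listFiles();
        System.out.println(children == null); // true

        try {
            // Sorting a null array throws NPE, as in the stack trace above.
            Arrays.sort(children);
        } catch (NullPointerException e) {
            System.out.println("NullPointerException, as in Arrays.sort(Arrays.java:1438)");
        }
    }
}
```

If this is the cause, checking each classpath entry the Spark interpreter is started with (every directory in SPARK_HOME, HADOOP_CONF_DIR, etc.) for existence should surface the bad entry.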
I did not know how to solve the problem, so I tried compiling the source code myself.
When I run the build compiled from source, I run into the same problem. The command I used to compile was:

mvn clean package -DskipTests -Pspark-3.0 -Pspark-scala-2.12 -Phadoop3 -Phive2 -Pjdbc-hadoop3 -Pflink-112 -Pweb-angular -Phelium-dev -Pexamples -Pbuild-distr -Dhttp.proxyHost=proxy-bj.nioint.com -Dhttp.proxyPort=8080

Do you think this compile command is correct?
Have I described the problem clearly? Please give me some guidance. Thank you very much.