ZEPPELIN-5520: zeppelin-0.10.0 failed to start the Spark task


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 0.10.0
    • Fix Version/s: None
    • Component/s: spark
    • Labels: None

    Description

      Let me describe the error scenario in detail:

      First, I downloaded zeppelin-0.10.0-bin-all.tgz from https://dlcdn.apache.org/zeppelin/zeppelin-0.10.0/zeppelin-0.10.0-bin-all.tgz and extracted it with tar -zxvf zeppelin-0.10.0-bin-all.tgz.
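
      As shell commands, the same step looks like this (wget is only an example here; any download tool works):

      # Download the 0.10.0 binary distribution and unpack it.
      wget https://dlcdn.apache.org/zeppelin/zeppelin-0.10.0/zeppelin-0.10.0-bin-all.tgz
      tar -zxvf zeppelin-0.10.0-bin-all.tgz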

      Second, I modified zeppelin-env.sh and zeppelin-site.xml:

      export JAVA_HOME=/moudle/jdk1.8
      export SPARK_HOME=/moudle/spark-3.0.2
      export HADOOP_HOME=/moudle/hadoop-3.3.0
      export HADOOP_CONF_DIR=/moudle/hadoop-3.3.0/etc/hadoop
      export PYSPARK_PYTHON=/usr/local/python37/bin/python3
      export PYTHONPATH=/usr/local/python37/bin/python3
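
      A quick way to double-check that these paths resolve before starting the daemon (just a sketch, assuming a POSIX shell):

      # Each configured binary should print its version if the paths above are correct.
      "$JAVA_HOME/bin/java" -version
      "$SPARK_HOME/bin/spark-submit" --version
      "$PYSPARK_PYTHON" --version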

      When I set up zeppelin-0.9.0 like the above, everything is OK, but zeppelin-0.10.0 fails when I run the following paragraph:

       

      %spark
      import sqlContext.implicits._
      val df = Seq(
      (1,2,3),
      (1,2,4),
      (1,2,3)
      ).toDF("first_col","second_col","third_col")
      df.show()

      I don't know how to solve this problem. The error reported by the interpreter is:

       

      org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
          at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
          at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:833)
          at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:741)
          at org.apache.zeppelin.scheduler.Job.run(Job.java:172)
          at org.apache.zeppelin.scheduler.AbstractScheduler.runJob(AbstractScheduler.java:132)
          at org.apache.zeppelin.scheduler.ParallelScheduler.lambda$runJobInScheduler$0(ParallelScheduler.java:46)
          at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
          at java.lang.Thread.run(Thread.java:748)
      Caused by: org.apache.zeppelin.interpreter.InterpreterException: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
          at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:76)
          at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:322)
          at org.apache.zeppelin.interpreter.Interpreter.getInterpreterInTheSameSessionByClassName(Interpreter.java:333)
          at org.apache.zeppelin.spark.SparkSqlInterpreter.open(SparkSqlInterpreter.java:56)
          at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
          ... 8 more
      Caused by: org.apache.zeppelin.interpreter.InterpreterException: Fail to open SparkInterpreter
          at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:137)
          at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
          ... 12 more
      Caused by: java.lang.NullPointerException
          at java.util.Arrays.sort(Arrays.java:1438)
          at scala.tools.nsc.classpath.JFileDirectoryLookup.listChildren(DirectoryClassPath.scala:118)
          at scala.tools.nsc.classpath.JFileDirectoryLookup.listChildren$(DirectoryClassPath.scala:102)
          at scala.tools.nsc.classpath.DirectoryClassPath.listChildren(DirectoryClassPath.scala:291)
          at scala.tools.nsc.classpath.DirectoryClassPath.listChildren(DirectoryClassPath.scala:291)
          at scala.tools.nsc.classpath.DirectoryLookup.list(DirectoryClassPath.scala:83)
          at scala.tools.nsc.classpath.DirectoryLookup.list$(DirectoryClassPath.scala:78)
          at scala.tools.nsc.classpath.DirectoryClassPath.list(DirectoryClassPath.scala:291)
          at scala.tools.nsc.classpath.AggregateClassPath.$anonfun$list$3(AggregateClassPath.scala:91)
          at scala.collection.Iterator.foreach(Iterator.scala:941)
          at scala.collection.Iterator.foreach$(Iterator.scala:941)
          at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
          at scala.collection.IterableLike.foreach(IterableLike.scala:74)
          at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
          at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
          at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:87)
          at scala.tools.nsc.util.ClassPath.list(ClassPath.scala:36)
          at scala.tools.nsc.util.ClassPath.list$(ClassPath.scala:36)
          at scala.tools.nsc.classpath.AggregateClassPath.list(AggregateClassPath.scala:30)
          at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:284)
          at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:230)
          at scala.reflect.internal.Symbols$Symbol.info(Symbols.scala:1542)
          at scala.reflect.internal.Mirrors$RootsBase.init(Mirrors.scala:257)
          at scala.tools.nsc.Global.rootMirror$lzycompute(Global.scala:74)
          at scala.tools.nsc.Global.rootMirror(Global.scala:72)
          at scala.tools.nsc.Global.rootMirror(Global.scala:44)
          at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:295)
          at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:295)
          at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1480)
          at scala.tools.nsc.Global$Run.<init>(Global.scala:1199)
          at scala.tools.nsc.interpreter.IMain._initialize(IMain.scala:132)
          at scala.tools.nsc.interpreter.IMain.initializeSynchronous(IMain.scala:154)
          at org.apache.zeppelin.spark.SparkScala212Interpreter.open(SparkScala212Interpreter.scala:84)
          at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:121)
          ... 13 more
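
      The innermost frames show the Scala compiler listing a directory on the interpreter classpath (JFileDirectoryLookup.listChildren). java.io.File.listFiles() returns null, not an empty array, when the directory it is asked to list does not exist, which would produce exactly this NullPointerException in Arrays.sort. Under that assumption (not confirmed in this ticket), a first check is that every directory handed to the interpreter actually exists, for example (a sketch; the paths are only the ones from zeppelin-env.sh above):

      # Assumption: a classpath entry pointing at a non-existent directory triggers the NPE.
      # Check the directories from zeppelin-env.sh; extend the list with any extra
      # classpath or spark.jars directories you configure.
      for d in "$JAVA_HOME" "$SPARK_HOME" "$HADOOP_HOME" "$HADOOP_CONF_DIR"; do
        [ -d "$d" ] || echo "missing directory: $d"
      done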

       

       

      So I tried compiling the source code myself, but I run into the same problem with my own build. The command I used to compile was:

      mvn clean package -DskipTests -Pspark-3.0 -Pspark-scala-2.12 -Phadoop3 -Phive2 \
          -Pjdbc-hadoop3 -Pflink-112 -Pweb-angular -Phelium-dev -Pexamples -Pbuild-distr \
          -Dhttp.proxyHost=proxy-bj.nioint.com -Dhttp.proxyPort=8080

      Do you think the compile command is OK?

      Have I described the problem clearly so far? Please give me some guidance. Thank you very much.

      Attachments

        1. image-2021-09-16-11-44-12-121.png (323 kB, niofei)
        2. image-2021-09-16-12-00-07-587.png (122 kB, niofei)
        3. image-2021-09-16-15-57-11-440.png (168 kB, niofei)
        4. image-2021-09-16-15-57-47-068.png (180 kB, niofei)
        5. image-2021-09-16-15-58-53-927.png (991 kB, niofei)
        6. image-2021-09-16-15-59-25-723.png (507 kB, niofei)
        7. image-2021-09-16-16-00-03-606.png (727 kB, niofei)
        8. image-2021-09-16-16-00-24-819.png (390 kB, niofei)

          People

            Assignee: Unassigned
            Reporter: niofei
            Votes: 0
            Watchers: 3
