
Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Environment: Spark 2.2 with CDH 5.12, Python 3.6.1, Java JDK 1.8_b31

    • Flags: Important

    Description

      I installed Spark 2.2 with CDH 5.12 following the official steps.
      Everything looks fine in Cloudera Manager, but pyspark2 fails to initialize.

      My environment: Python 3.6.1, JDK 1.8, CDH 5.12.

      This problem has been driving me crazy for several days, and I have not found
      a way to solve it. Can anyone help? Thank you very much!
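      The failing pyspark2 session is pasted below. One way to dig further into why
      the YARN application master dies (assuming log aggregation is enabled on the
      cluster; <application_id> is only a placeholder for the id of the failed run
      shown in the ResourceManager UI) would be to pull the aggregated YARN log:

      [hdfs@Master /data/soft/spark2.2]$ yarn logs -applicationId <application_id>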

      [hdfs@Master /data/soft/spark2.2]$ pyspark2
      Python 3.6.1 (default, Jul 27 2017, 11:07:01)
      [GCC 4.4.6 20110731 (Red Hat 4.4.6-4)] on linux
      Type "help", "copyright", "credits" or "license" for more information.
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      17/07/27 12:02:09 ERROR spark.SparkContext: Error initializing SparkContext.
      org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
      at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)
      17/07/27 12:02:09 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
      17/07/27 12:02:09 ERROR util.Utils: Uncaught exception in thread Thread-2
      java.lang.NullPointerException
      at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:141)
      at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1485)
      at org.apache.spark.SparkEnv.stop(SparkEnv.scala:90)
      at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1937)
      at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1317)
      at org.apache.spark.SparkContext.stop(SparkContext.scala:1936)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:587)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)
      /opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/shell.py:52: UserWarning: Fall back to non-hive support because failing to access HiveConf, please make sure you build spark with hive
      warnings.warn("Fall back to non-hive support because failing to access HiveConf, "
      17/07/27 12:02:09 WARN spark.SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
      org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      py4j.Gateway.invoke(Gateway.java:236)
      py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      py4j.GatewayConnection.run(GatewayConnection.java:214)
      java.lang.Thread.run(Thread.java:748)
      17/07/27 12:02:14 ERROR spark.SparkContext: Error initializing SparkContext.
      org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
      at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)
      17/07/27 12:02:14 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
      17/07/27 12:02:14 ERROR util.Utils: Uncaught exception in thread Thread-2
      java.lang.NullPointerException
      at org.apache.spark.network.shuffle.ExternalShuffleClient.close(ExternalShuffleClient.java:141)
      at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1485)
      at org.apache.spark.SparkEnv.stop(SparkEnv.scala:90)
      at org.apache.spark.SparkContext$$anonfun$stop$11.apply$mcV$sp(SparkContext.scala:1937)
      at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1317)
      at org.apache.spark.SparkContext.stop(SparkContext.scala:1936)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:587)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)
      Traceback (most recent call last):
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/shell.py", line 45, in <module>
      spark = SparkSession.builder\
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/sql/session.py", line 169, in getOrCreate
      sc = SparkContext.getOrCreate(sparkConf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 334, in getOrCreate
      SparkContext(conf=conf or SparkConf())
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 118, in _init_
      conf, jsc, profiler_cls)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 180, in _do_init
      self._jsc = jsc or self._initialize_context(self._conf._jconf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 273, in _initialize_context
      return self._jvm.JavaSparkContext(jconf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in _call_
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
      py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
      : org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
      at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)

      During handling of the above exception, another exception occurred:

      Traceback (most recent call last):
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/shell.py", line 54, in <module>
      spark = SparkSession.builder.getOrCreate()
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/sql/session.py", line 169, in getOrCreate
      sc = SparkContext.getOrCreate(sparkConf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 334, in getOrCreate
      SparkContext(conf=conf or SparkConf())
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 118, in _init_
      conf, jsc, profiler_cls)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 180, in _do_init
      self._jsc = jsc or self._initialize_context(self._conf._jconf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/pyspark/context.py", line 273, in _initialize_context
      return self._jvm.JavaSparkContext(jconf)
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1401, in _call_
      File "/opt/cloudera/parcels/SPARK2-2.2.0.cloudera1-1.cdh5.12.0.p0.142354/lib/spark2/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
      py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
      : org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
      at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
      at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:173)
      at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
      at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
      at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
      at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
      at py4j.Gateway.invoke(Gateway.java:236)
      at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
      at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
      at py4j.GatewayConnection.run(GatewayConnection.java:214)
      at java.lang.Thread.run(Thread.java:748)

      Can anyone help me?
      I searched for this on Stack Overflow and Google and found nothing.
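      To narrow down whether the problem is in the Spark 2 parcel itself or on the
      YARN side, one quick check (a troubleshooting sketch, not a confirmed fix; the
      Python path below is only an example for this machine) is to start the shell
      with a local master, so that no application master is launched, and to point
      Spark explicitly at the Python 3.6 interpreter:

      [hdfs@Master /data/soft/spark2.2]$ export PYSPARK_PYTHON=/usr/local/bin/python3.6
      [hdfs@Master /data/soft/spark2.2]$ pyspark2 --master local[2]

      If the local shell starts cleanly, the aggregated YARN application log (see the
      yarn logs command above) is the next place to look.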

          People

            Assignee: Unassigned
            Reporter: gumpcheng
