[SPARK-29046] Possible NPE on SQLConf.get when SparkContext is stopping in another thread


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4, 3.0.0
    • Fix Version/s: 2.4.5, 3.0.0
    • Component/s: SQL
    • Labels: None

    Description

      We encountered an NPE in listener code that deals with the query plan. According to the stack trace below, the only possible source of the NPE is SparkContext._dagScheduler being null, which can only happen while the SparkContext is being stopped (unless null is set from outside).

       

19/09/11 00:22:24 INFO server.AbstractConnector: Stopped Spark@49d8c117{HTTP/1.1,[http/1.1]}{0.0.0.0:0}
19/09/11 00:22:24 INFO ui.SparkUI: Stopped Spark web UI at http://....:32770
19/09/11 00:22:24 INFO cluster.YarnClusterSchedulerBackend: Shutting down all executors
19/09/11 00:22:24 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
19/09/11 00:22:24 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices(serviceOption=None, services=List(), started=false)
19/09/11 00:22:24 WARN sql.SparkExecutionPlanProcessor: Caught exception during parsing event
java.lang.NullPointerException
  at org.apache.spark.sql.internal.SQLConf$$anonfun$15.apply(SQLConf.scala:133)
  at org.apache.spark.sql.internal.SQLConf$$anonfun$15.apply(SQLConf.scala:133)
  at scala.Option.map(Option.scala:146)
  at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:133)
  at org.apache.spark.sql.types.StructType.simpleString(StructType.scala:352)
  at com.hortonworks.spark.atlas.types.internal$.sparkTableToEntity(internal.scala:102)
  at com.hortonworks.spark.atlas.types.AtlasEntityUtils$class.tableToEntity(AtlasEntityUtils.scala:62)
  at com.hortonworks.spark.atlas.sql.CommandsHarvester$.tableToEntity(CommandsHarvester.scala:45)
  at com.hortonworks.spark.atlas.sql.CommandsHarvester$$anonfun$com$hortonworks$spark$atlas$sql$CommandsHarvester$$discoverInputsEntities$1.apply(CommandsHarvester.scala:240)
  at com.hortonworks.spark.atlas.sql.CommandsHarvester$$anonfun$com$hortonworks$spark$atlas$sql$CommandsHarvester$$discoverInputsEntities$1.apply(CommandsHarvester.scala:239)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
  at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
  at com.hortonworks.spark.atlas.sql.CommandsHarvester$.com$hortonworks$spark$atlas$sql$CommandsHarvester$$discoverInputsEntities(CommandsHarvester.scala:239)
  at com.hortonworks.spark.atlas.sql.CommandsHarvester$CreateDataSourceTableAsSelectHarvester$.harvest(CommandsHarvester.scala:104)
  at com.hortonworks.spark.atlas.sql.SparkExecutionPlanProcessor$$anonfun$2.apply(SparkExecutionPlanProcessor.scala:138)
  at com.hortonworks.spark.atlas.sql.SparkExecutionPlanProcessor$$anonfun$2.apply(SparkExecutionPlanProcessor.scala:89)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
  at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
  at com.hortonworks.spark.atlas.sql.SparkExecutionPlanProcessor.process(SparkExecutionPlanProcessor.scala:89)
  at com.hortonworks.spark.atlas.sql.SparkExecutionPlanProcessor.process(SparkExecutionPlanProcessor.scala:63)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anonfun$eventProcess$1.apply(AbstractEventProcessor.scala:72)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anonfun$eventProcess$1.apply(AbstractEventProcessor.scala:71)
  at scala.Option.foreach(Option.scala:257)
  at com.hortonworks.spark.atlas.AbstractEventProcessor.eventProcess(AbstractEventProcessor.scala:71)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anon$1.run(AbstractEventProcessor.scala:38)
19/09/11 00:22:24 WARN sql.SparkCatalogEventProcessor: Caught exception during parsing event
java.lang.NullPointerException
  at org.apache.spark.sql.internal.SQLConf$$anonfun$15.apply(SQLConf.scala:133)
  at org.apache.spark.sql.internal.SQLConf$$anonfun$15.apply(SQLConf.scala:133)
  at scala.Option.map(Option.scala:146)
  at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:133)
  at org.apache.spark.sql.types.StructType.simpleString(StructType.scala:352)
  at com.hortonworks.spark.atlas.types.internal$.sparkTableToEntity(internal.scala:102)
  at com.hortonworks.spark.atlas.types.AtlasEntityUtils$class.sparkTableToEntity(AtlasEntityUtils.scala:69)
  at com.hortonworks.spark.atlas.sql.SparkCatalogEventProcessor.sparkTableToEntity(SparkCatalogEventProcessor.scala:28)
  at com.hortonworks.spark.atlas.sql.SparkCatalogEventProcessor.process(SparkCatalogEventProcessor.scala:80)
  at com.hortonworks.spark.atlas.sql.SparkCatalogEventProcessor.process(SparkCatalogEventProcessor.scala:28)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anonfun$eventProcess$1.apply(AbstractEventProcessor.scala:72)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anonfun$eventProcess$1.apply(AbstractEventProcessor.scala:71)
  at scala.Option.foreach(Option.scala:257)
  at com.hortonworks.spark.atlas.AbstractEventProcessor.eventProcess(AbstractEventProcessor.scala:71)
  at com.hortonworks.spark.atlas.AbstractEventProcessor$$anon$1.run(AbstractEventProcessor.scala:38)
19/09/11 00:22:24 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/09/11 00:22:24 INFO memory.MemoryStore: MemoryStore cleared
19/09/11 00:22:24 INFO storage.BlockManager: BlockManager stopped
19/09/11 00:22:24 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
19/09/11 00:22:24 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/09/11 00:22:24 INFO spark.SparkContext: Successfully stopped SparkContext
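For illustration only, below is a minimal, self-contained Scala sketch of the failure mode suggested by the stack trace. It is not the actual Spark source; Context and EventLoop are hypothetical stand-ins for SparkContext, its _dagScheduler field, and the scheduler event loop thread. The point is that wrapping the outer reference in an Option (as the Option.map frame in the trace implies) does not protect against an inner field being nulled out by a concurrent stop(); re-wrapping the inner field in Option does.

{code:scala}
// Sketch of the race implied by the SPARK-29046 stack trace (not Spark source).
object DagSchedulerNpeSketch {

  // Hypothetical stand-ins for the scheduler event loop and SparkContext.
  final class EventLoop { val thread: Thread = Thread.currentThread() }
  final class Context { @volatile var dagScheduler: EventLoop = new EventLoop }

  def main(args: Array[String]): Unit = {
    val active = new Context

    // Simulates SparkContext.stop() running in another thread:
    // the inner field becomes null while the context reference is still visible.
    active.dagScheduler = null

    // Failing pattern: Option guards the outer reference only; the map body
    // dereferences dagScheduler, which is now null, and throws an NPE.
    val unsafe =
      try {
        Option(active)
          .map(_.dagScheduler.thread)
          .exists(_.getId == Thread.currentThread().getId)
      } catch {
        case _: NullPointerException =>
          println("NPE while dereferencing dagScheduler, as in SPARK-29046")
          false
      }

    // Defensive variant: re-wrap the inner field in Option before using it,
    // so a null dagScheduler simply yields false instead of an NPE.
    val safe = Option(active)
      .flatMap(ctx => Option(ctx.dagScheduler))
      .map(_.thread)
      .exists(_.getId == Thread.currentThread().getId)

    println(s"unsafe=$unsafe, safe=$safe")
  }
}
{code}

One plausible fix direction, consistent with the sketch above, is to guard the dagScheduler access inside SQLConf.get the same defensive way, treating a null dagScheduler as "not the scheduler event loop thread" while the SparkContext is stopping.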


            People

              Assignee: Jungtaek Lim (kabhwan)
              Reporter: Jungtaek Lim (kabhwan)
              Votes: 0
              Watchers: 3
