  Spark / SPARK-22708

Spark on YARN fails with an exception, but Final app status: SUCCEEDED, exitCode: 0


Details

    • Type: Question
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Spark Core, YARN
    • Labels: None

    Description

      I got this log:

      17/12/06 18:14:59 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
      17/12/06 18:15:01 INFO util.Version: Elasticsearch Hadoop v6.0.0 [8b59a8f82d]
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: I/O exception (java.net.ConnectException) caught when processing request: Connection refused (Connection refused)
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: Retrying request
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: I/O exception (java.net.ConnectException) caught when processing request: Connection refused (Connection refused)
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: Retrying request
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: I/O exception (java.net.ConnectException) caught when processing request: Connection refused (Connection refused)
      17/12/06 18:15:02 INFO httpclient.HttpMethodDirector: Retrying request
      17/12/06 18:15:02 ERROR rest.NetworkClient: Node [192.168.200.154:9200] failed (Connection refused (Connection refused)); no other nodes left - aborting...
      17/12/06 18:15:02 ERROR dispatcher.StrategyDispatcher: 调用链路异常 (call chain exception)
      org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
      	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:327)
      	at org.elasticsearch.spark.sql.SchemaUtils$.discoverMappingAndGeoFields(SchemaUtils.scala:98)
      	at org.elasticsearch.spark.sql.SchemaUtils$.discoverMapping(SchemaUtils.scala:91)
      	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema$lzycompute(DefaultSource.scala:196)
      	at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema(DefaultSource.scala:196)
      	at org.elasticsearch.spark.sql.ElasticsearchRelation$$anonfun$schema$1.apply(DefaultSource.scala:200)
      	at org.elasticsearch.spark.sql.ElasticsearchRelation$$anonfun$schema$1.apply(DefaultSource.scala:200)
      	at scala.Option.getOrElse(Option.scala:121)
      	at org.elasticsearch.spark.sql.ElasticsearchRelation.schema(DefaultSource.scala:200)
      	at org.apache.spark.sql.execution.datasources.LogicalRelation$.apply(LogicalRelation.scala:77)
      	at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:415)
      	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:172)
      	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:156)
      	at streaming.core.compositor.spark.source.MultiSQLSourceCompositor$$anonfun$result$1.apply(MultiSQLSourceCompositor.scala:37)
      	at streaming.core.compositor.spark.source.MultiSQLSourceCompositor$$anonfun$result$1.apply(MultiSQLSourceCompositor.scala:27)
      	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
      	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
      	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
      	at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
      	at streaming.core.compositor.spark.source.MultiSQLSourceCompositor.result(MultiSQLSourceCompositor.scala:27)
      	at streaming.core.strategy.SparkStreamingStrategy.result(SparkStreamingStrategy.scala:52)
      	at serviceframework.dispatcher.StrategyDispatcher$$anonfun$dispatch$2.apply(StrategyDispatcher.scala:65)
      	at serviceframework.dispatcher.StrategyDispatcher$$anonfun$dispatch$2.apply(StrategyDispatcher.scala:63)
      	at scala.collection.immutable.List.flatMap(List.scala:327)
      	at serviceframework.dispatcher.StrategyDispatcher.dispatch(StrategyDispatcher.scala:62)
      	at streaming.core.strategy.platform.PlatformManager$$anonfun$run$3.apply(PlatformManager.scala:120)
      	at streaming.core.strategy.platform.PlatformManager$$anonfun$run$3.apply(PlatformManager.scala:118)
      	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
      	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
      	at streaming.core.strategy.platform.PlatformManager.run(PlatformManager.scala:117)
      	at streaming.core.StreamingApp$.main(StreamingApp.scala:14)
      	at streaming.core.StreamingApp.main(StreamingApp.scala)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:635)
      Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[192.168.200.154:9200]] 
      	at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:149)
      	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:466)
      	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:430)
      	at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:434)
      	at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:155)
      	at org.elasticsearch.hadoop.rest.RestClient.remoteEsVersion(RestClient.java:660)
      	at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:320)
      	... 36 more
      17/12/06 18:15:02 INFO yarn.ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0
      17/12/06 18:15:02 INFO spark.SparkContext: Invoking stop() from shutdown hook
      17/12/06 18:15:02 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.13.150:45073
      17/12/06 18:15:02 INFO yarn.YarnAllocator: Driver requested a total number of 0 executor(s).
      17/12/06 18:15:02 INFO cluster.YarnClusterSchedulerBackend: Shutting down all executors
      17/12/06 18:15:02 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
      17/12/06 18:15:02 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
      (serviceOption=None,
       services=List(),
       started=false)
      17/12/06 18:15:02 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
      17/12/06 18:15:02 INFO memory.MemoryStore: MemoryStore cleared
      17/12/06 18:15:02 INFO storage.BlockManager: BlockManager stopped
      17/12/06 18:15:02 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
      17/12/06 18:15:02 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
      17/12/06 18:15:02 INFO spark.SparkContext: Successfully stopped SparkContext
      17/12/06 18:15:02 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with SUCCEEDED
      17/12/06 18:15:02 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered.
      17/12/06 18:15:02 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://nameservice1/user/spark2/.sparkStaging/application_1511515104748_13554
      17/12/06 18:15:02 INFO util.ShutdownHookManager: Shutdown hook called
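
      For context, the exception above is thrown while elasticsearch-hadoop probes the cluster version during schema discovery, before any Spark job runs. As a rough sketch (the index name is illustrative, not taken from this job), the connection settings named in the error message are passed as reader options:

      // Sketch only: "es.nodes", "es.port" and "es.nodes.wan.only" are standard
      // elasticsearch-hadoop options; the index name here is made up.
      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().appName("es-read-sketch").getOrCreate()
      val df = spark.read
        .format("org.elasticsearch.spark.sql")
        .option("es.nodes", "192.168.200.154")
        .option("es.port", "9200")
        .option("es.nodes.wan.only", "true") // only for WAN/Cloud deployments
        .load("my-index/my-type")

      Note that the failure in this log is a plain Connection refused, so the first thing to verify is that Elasticsearch is actually listening on 192.168.200.154:9200; es.nodes.wan.only only helps when reachable nodes advertise non-routable addresses.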
      

      But the ApplicationMaster still reported:

      17/12/06 10:45:15 INFO yarn.ApplicationMaster: Final app status: SUCCEEDED, exitCode: 0
      

      Spark version: 2.2.0, YARN version: 2.6
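
      The "Not A Problem" resolution follows from where the exception is handled: the stack trace is logged by the application's own serviceframework.dispatcher.StrategyDispatcher and, judging by the clean shutdown that follows, it never escapes StreamingApp.main, so the ApplicationMaster unregisters with SUCCEEDED. YARN's final status reflects whether the user class's main method throws, not whether errors were logged along the way. A minimal sketch of the distinction (names are hypothetical, not the application's actual code):

      // Sketch only: runPipeline stands in for the application's dispatch call.
      object DriverApp {
        def runPipeline(args: Array[String]): Unit = ??? // application logic

        def main(args: Array[String]): Unit = {
          try runPipeline(args)
          catch {
            case e: Exception =>
              System.err.println(s"call chain failed: $e")
              // Swallowing the exception here lets main return normally, and
              // the ApplicationMaster reports SUCCEEDED. Rethrowing it makes
              // YARN record the application as FAILED instead:
              throw e
          }
        }
      }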


          People

            Assignee: Unassigned
            Reporter: mvpanswer7
