Spark / SPARK-26941

incorrect computation of maxNumExecutorFailures in ApplicationMaster for streaming


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.1.0, 2.4.0
    • Fix Version/s: 3.0.0
    • Component/s: Spark Core, YARN
    • Labels: None

    Description

      Currently, when streaming dynamic allocation is enabled for a streaming application, the maxNumExecutorFailures in ApplicationMaster is still computed from `spark.dynamicAllocation.maxExecutors`.

      It should instead be based on `spark.streaming.dynamicAllocation.maxExecutors`.

      Related code:

      private val maxNumExecutorFailures = {
        val effectiveNumExecutors =
          if (Utils.isStreamingDynamicAllocationEnabled(sparkConf)) {
            sparkConf.get(STREAMING_DYN_ALLOCATION_MAX_EXECUTORS)
          } else if (Utils.isDynamicAllocationEnabled(sparkConf)) {
            sparkConf.get(DYN_ALLOCATION_MAX_EXECUTORS)
          } else {
            sparkConf.get(EXECUTOR_INSTANCES).getOrElse(0)
          }
        // By default, effectiveNumExecutors is Int.MaxValue if dynamic allocation is enabled.
        // We need to avoid integer overflow here.
        val defaultMaxNumExecutorFailures = math.max(3,
          if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue else 2 * effectiveNumExecutors)

        sparkConf.get(MAX_EXECUTOR_FAILURES).getOrElse(defaultMaxNumExecutorFailures)
      }
      
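      The selection logic above can be sketched as a standalone snippet. This is a minimal illustration only, using a plain Map[String, String] in place of SparkConf; the config key strings are the standard Spark ones, but the object and method names here are hypothetical:

      ```scala
      // Sketch of the corrected default for maxNumExecutorFailures:
      // prefer the streaming-specific cap when streaming dynamic allocation is on.
      object MaxExecutorFailures {
        def compute(conf: Map[String, String]): Int = {
          val streamingDynAlloc =
            conf.getOrElse("spark.streaming.dynamicAllocation.enabled", "false").toBoolean
          val dynAlloc =
            conf.getOrElse("spark.dynamicAllocation.enabled", "false").toBoolean

          val effectiveNumExecutors =
            if (streamingDynAlloc) {
              // The fix: consult the streaming-specific maximum first
              conf.getOrElse("spark.streaming.dynamicAllocation.maxExecutors",
                Int.MaxValue.toString).toInt
            } else if (dynAlloc) {
              conf.getOrElse("spark.dynamicAllocation.maxExecutors",
                Int.MaxValue.toString).toInt
            } else {
              conf.getOrElse("spark.executor.instances", "0").toInt
            }

          // maxExecutors defaults to Int.MaxValue under dynamic allocation,
          // so doubling it naively would overflow; clamp instead.
          val default = math.max(3,
            if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue
            else 2 * effectiveNumExecutors)

          conf.get("spark.yarn.max.executor.failures").map(_.toInt).getOrElse(default)
        }
      }
      ```

      With a streaming cap of 10 executors this yields a failure tolerance of 20 rather than the Int.MaxValue that falling through to `spark.dynamicAllocation.maxExecutors` would produce.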

            People

              Assignee: liupengcheng
              Reporter: liupengcheng
              Votes: 0
              Watchers: 1

              Dates

                Created:
                Updated:
                Resolved: