SPARK-46681

Refactor `ExecutorFailureTracker#maxNumExecutorFailures` to avoid unnecessary computations when `MAX_EXECUTOR_FAILURES` is configured


Description

      def maxNumExecutorFailures(sparkConf: SparkConf): Int = {
        val effectiveNumExecutors =
          if (Utils.isStreamingDynamicAllocationEnabled(sparkConf)) {
            sparkConf.get(STREAMING_DYN_ALLOCATION_MAX_EXECUTORS)
          } else if (Utils.isDynamicAllocationEnabled(sparkConf)) {
            sparkConf.get(DYN_ALLOCATION_MAX_EXECUTORS)
          } else {
            sparkConf.get(EXECUTOR_INSTANCES).getOrElse(0)
          }
        // By default, effectiveNumExecutors is Int.MaxValue if dynamic allocation is
        // enabled. We need to avoid integer overflow here.
        val defaultMaxNumExecutorFailures = math.max(3,
          if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue else 2 * effectiveNumExecutors)
      
        sparkConf.get(MAX_EXECUTOR_FAILURES).getOrElse(defaultMaxNumExecutorFailures)
      } 

      Currently, defaultMaxNumExecutorFailures is always computed first, even when MAX_EXECUTOR_FAILURES is explicitly configured, so that computation is wasted in the configured case.
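      One possible shape for the refactor (a sketch only, not necessarily the merged patch): since Scala's `Option.getOrElse` takes its default argument by name, moving the default-value computation into the `getOrElse` block defers it until `MAX_EXECUTOR_FAILURES` is actually absent.

      ```scala
      def maxNumExecutorFailures(sparkConf: SparkConf): Int = {
        // Option.getOrElse evaluates its argument by name, so the block below
        // runs only when MAX_EXECUTOR_FAILURES is not configured.
        sparkConf.get(MAX_EXECUTOR_FAILURES).getOrElse {
          val effectiveNumExecutors =
            if (Utils.isStreamingDynamicAllocationEnabled(sparkConf)) {
              sparkConf.get(STREAMING_DYN_ALLOCATION_MAX_EXECUTORS)
            } else if (Utils.isDynamicAllocationEnabled(sparkConf)) {
              sparkConf.get(DYN_ALLOCATION_MAX_EXECUTORS)
            } else {
              sparkConf.get(EXECUTOR_INSTANCES).getOrElse(0)
            }
          // Guard against overflow: with dynamic allocation enabled the
          // effective max defaults to Int.MaxValue.
          math.max(3,
            if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue
            else 2 * effectiveNumExecutors)
        }
      }
      ```

      An equivalent alternative is hoisting the default into a local `def` or `lazy val`; the by-name `getOrElse` block keeps the change smallest.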
       
       
       
       
       

      People

        LuciferYang Yang Jie