Spark / SPARK-12534

Document missing command line options to Spark properties mapping

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.5.2
    • Fix Version/s: 2.0.0
    • Component/s: Deploy, Documentation, YARN
    • Labels:
      None

      Description

      Several Spark properties that are equivalent to spark-submit command line options are missing from the documentation.

      For example, the equivalent of spark-submit --num-executors should be
      spark.executor.instances, but that mapping is not documented for use in SparkConf:
      http://spark.apache.org/docs/latest/running-on-yarn.html

      Could you try setting that with sparkR.init()?
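
      The property name used in the email below (spark.num.executors) does not exist; based on the mapping above, a sketch of the corrected call with the Spark 1.x SparkR API, assuming a reachable YARN cluster, would be:

      ```r
      # --num-executors on the spark-submit command line maps to the
      # spark.executor.instances property, passed via sparkEnvir:
      library(SparkR)
      sc <- sparkR.init(master = "yarn-client",
                        sparkEnvir = list(spark.executor.instances = "6"))
      ```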

      _____________________________
      From: Franc Carter <franc.carter@gmail.com>
      Sent: Friday, December 25, 2015 9:23 PM
      Subject: number of executors in sparkR.init()
      To: <user@spark.apache.org>

      Hi,

      I'm having trouble working out how to get the number of executors set when using sparkR.init().

      If I start sparkR with

      sparkR --master yarn --num-executors 6

      then I get 6 executors

      However, if start sparkR with

      sparkR

      followed by

      sc <- sparkR.init(master="yarn-client", sparkEnvir=list(spark.num.executors='6'))

      then I only get 2 executors.

      Can anyone point me in the direction of what I might be doing wrong? I need to initialise it this way so that RStudio can hook into SparkR.

      thanks


      Franc


      People

      • Assignee: Felix Cheung
      • Reporter: Felix Cheung
      • Votes: 0
      • Watchers: 1
