Spark / SPARK-1944

Document --verbose in spark-shell -h



    • Type: Documentation
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.1, 1.1.0
    • Component/s: Spark Core
    • Labels:


      The help output below for spark-submit should also document the --verbose option:

      aash@aash-mbp ~/git/spark$ ./bin/spark-submit -h
      Usage: spark-submit [options] <app jar> [app options]
        --master MASTER_URL         spark://host:port, mesos://host:port, yarn, or local.
        --deploy-mode DEPLOY_MODE   Mode to deploy the app in, either 'client' or 'cluster'.
        --class CLASS_NAME          Name of your app's main class (required for Java apps).
        --arg ARG                   Argument to be passed to your application's main class. This
                                    option can be specified multiple times for multiple args.
        --name NAME                 The name of your application (Default: 'Spark').
        --jars JARS                 A comma-separated list of local jars to include on the
                                    driver classpath and that SparkContext.addJar will work
                                    with. Doesn't work on standalone with 'cluster' deploy mode.
        --files FILES               Comma separated list of files to be placed in the working dir
                                    of each executor.
        --properties-file FILE      Path to a file from which to load extra properties. If not
                                    specified, this will look for conf/spark-defaults.conf.
        --driver-memory MEM         Memory for driver (e.g. 1000M, 2G) (Default: 512M).
        --driver-java-options       Extra Java options to pass to the driver
        --driver-library-path       Extra library path entries to pass to the driver
        --driver-class-path         Extra class path entries to pass to the driver. Note that
                                    jars added with --jars are automatically included in the
                                    classpath.
        --executor-memory MEM       Memory per executor (e.g. 1000M, 2G) (Default: 1G).
       Spark standalone with cluster deploy mode only:
        --driver-cores NUM          Cores for driver (Default: 1).
        --supervise                 If given, restarts the driver on failure.
       Spark standalone and Mesos only:
        --total-executor-cores NUM  Total cores for all executors.
       YARN-only:
        --executor-cores NUM        Number of cores per executor (Default: 1).
        --queue QUEUE_NAME          The YARN queue to submit to (Default: 'default').
        --num-executors NUM         Number of executors to launch (Default: 2).
        --archives ARCHIVES         Comma separated list of archives to be extracted into the
                                    working dir of each executor.
      aash@aash-mbp ~/git/spark$
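For context on why this class of omission happens: the usage text above is a hand-maintained string, so a flag such as --verbose can be accepted by the argument parser without ever appearing in the help output. A minimal, illustrative Python sketch (not Spark code; the program name and flags here are assumptions) of the alternative approach, where help text is generated from the registered options and a documented flag cannot be silently dropped:

```python
import argparse

# Illustrative parser only -- registering an option here makes it appear in
# the generated help text, so the printed usage cannot omit a supported flag.
parser = argparse.ArgumentParser(prog="spark-submit")
parser.add_argument("--master", metavar="MASTER_URL",
                    help="spark://host:port, mesos://host:port, yarn, or local.")
parser.add_argument("--verbose", "-v", action="store_true",
                    help="Print additional debug output.")

help_text = parser.format_help()
print("--verbose" in help_text)  # True: the flag is documented automatically
```

With a hand-written usage string, by contrast, nothing ties the parser's accepted flags to what `-h` prints, which is exactly the gap this issue reports.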




            • Assignee: Andrew Ash (aash)
            • Votes: 0
            • Watchers: 2

