Spark / SPARK-4230

Doc for spark.default.parallelism is incorrect


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0
    • Fix Version/s: 1.2.0
    • Component/s: Spark Core
    • Labels:
      None
    • Target Version/s:

      Description

      When spark.default.parallelism is not set, the parallelism for shuffle transformations actually defaults to the largest number of partitions among the parent RDDs.

      The docs should probably also be clear about what SparkContext.defaultParallelism is used for.
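
      A small sketch of the behavior described above (partition counts are illustrative; assumes a Spark 1.x local context with spark.default.parallelism deliberately left unset):

      ```scala
      import org.apache.spark.{SparkConf, SparkContext}

      object DefaultParallelismSketch {
        def main(args: Array[String]): Unit = {
          // spark.default.parallelism is intentionally NOT set in this conf.
          val sc = new SparkContext(
            new SparkConf().setMaster("local[4]").setAppName("default-parallelism-sketch"))

          // Two parent RDDs with different partition counts.
          val left  = sc.parallelize(1 to 100, 8).map(x => (x % 10, x))
          val right = sc.parallelize(1 to 100, 3).map(x => (x % 10, x))

          // With spark.default.parallelism unset, the shuffle uses the largest
          // partition count among the parent RDDs: max(8, 3) = 8, regardless of
          // sc.defaultParallelism (which is 4 in local[4] mode).
          println(left.join(right).partitions.length)  // 8

          sc.stop()
        }
      }
      ```

      Setting spark.default.parallelism in the conf would override this and make the join use that value instead.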

        Attachments

          Activity

            People

            • Assignee:
              sandyr Sandy Ryza
            • Reporter:
              sandyr Sandy Ryza
            • Votes:
              0
            • Watchers:
              2

              Dates

              • Created:
                Updated:
                Resolved: