SPARK-3071: Increase default driver memory

Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Fix Version/s: 1.4.2, 1.5.0
    • Component/s: Spark Core
    • Labels: None

    Description

      The current default is 512 MB, which is usually too small because users also do some of their computation on the driver. In local mode the executor memory setting is ignored and only the driver memory is used, which provides even more incentive to increase the default driver memory.
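
      For reference, the setting in question is spark.driver.memory, and it has to be supplied before the driver JVM starts (for example via spark-submit --driver-memory or conf/spark-defaults.conf); it cannot be raised from SparkConf inside the application in client or local mode. Below is a minimal local-mode sketch, with an illustrative object name and messages, that prints the heap the driver actually received and shows why the executor setting is moot in local mode:

      import org.apache.spark.{SparkConf, SparkContext}

      // Illustrative local-mode app; the object name and messages are made up.
      // spark.driver.memory must be set before the driver JVM starts
      // (e.g. spark-submit --driver-memory 2g or conf/spark-defaults.conf);
      // setting it on SparkConf here would be too late in client/local mode.
      object DriverMemoryCheck {
        def main(args: Array[String]): Unit = {
          val conf = new SparkConf().setAppName("DriverMemoryCheck").setMaster("local[*]")
          val sc = new SparkContext(conf)

          // The heap this driver JVM actually got, derived from the driver memory setting.
          println(s"Driver max heap: ${Runtime.getRuntime.maxMemory / (1024 * 1024)} MB")

          // In local mode tasks run inside this same JVM, so spark.executor.memory
          // has no separate executor process to apply to.
          println(s"spark.executor.memory = ${conf.get("spark.executor.memory", "<unset>")}")

          sc.stop()
        }
      }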

      I suggest:

      1. 2 GB in local mode, and warn users if executor memory is set to a bigger value (see the sketch after this list).
      2. The same as the worker memory on an EC2 standalone cluster.
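
      A minimal sketch of the warning from point 1, in plain Scala with no Spark dependency; LocalModeMemoryWarning, parseMb, and maybeWarn are hypothetical names, and the 2g fallback is just the default proposed above:

      // Hedged sketch of the proposed local-mode warning; not Spark code.
      object LocalModeMemoryWarning {
        // Parse memory strings like "512m" or "2g" into megabytes
        // (plain numbers are treated as bytes; other suffixes omitted for brevity).
        private def parseMb(s: String): Long = {
          val v = s.trim.toLowerCase
          if (v.endsWith("g")) v.dropRight(1).toLong * 1024
          else if (v.endsWith("m")) v.dropRight(1).toLong
          else v.toLong / (1024 * 1024)
        }

        // Warn when executor memory is configured higher than driver memory,
        // since only the driver memory takes effect in local mode.
        def maybeWarn(settings: Map[String, String]): Unit = {
          val driverMb = parseMb(settings.getOrElse("spark.driver.memory", "2g"))
          settings.get("spark.executor.memory").map(parseMb).filter(_ > driverMb).foreach { mb =>
            Console.err.println(
              s"WARNING: spark.executor.memory ($mb MB) is ignored in local mode; " +
              s"only spark.driver.memory ($driverMb MB) applies.")
          }
        }

        def main(args: Array[String]): Unit =
          maybeWarn(Map("spark.driver.memory" -> "2g", "spark.executor.memory" -> "4g"))
      }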

People

    Assignee: Ilya Ganelin (ilganeli)
    Reporter: Xiangrui Meng (mengxr)
    Votes: 1
    Watchers: 4
