
[SPARK-17554] spark.executor.memory option not working


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Spark Core
    • Labels: None

    Description

      Hi,

      I am new to Spark. I have a Spark cluster with 5 slaves (each with 2 cores and 4 GB of RAM). In the Spark cluster dashboard the memory per node is shown as 1 GB. I tried to increase it to 2 GB by setting spark.executor.memory 2g in spark-defaults.conf, but it did not take effect. I want to increase the executor memory. Please let me know how to do that.
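      For reference, a minimal sketch (Scala) of setting executor memory programmatically; the app name is illustrative and 2g is just the value from the report. The same property can also be set in conf/spark-defaults.conf or passed as spark-submit --executor-memory 2g, and in all cases it only takes effect for applications started after the change, provided the workers actually have that much memory available:

        import org.apache.spark.{SparkConf, SparkContext}

        object ExecutorMemoryExample {
          def main(args: Array[String]): Unit = {
            // Executor memory must be set before the SparkContext is created;
            // changing it has no effect on an already-running application.
            val conf = new SparkConf()
              .setAppName("ExecutorMemoryExample")      // illustrative app name
              .set("spark.executor.memory", "2g")       // value taken from the report

            // Submit with spark-submit so the master URL comes from the cluster setup.
            val sc = new SparkContext(conf)

            // Confirm the amount the driver will request for each executor.
            println("spark.executor.memory = " + sc.getConf.get("spark.executor.memory"))
            sc.stop()
          }
        }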

    Attachments

    Activity

    People

      Assignee: Unassigned
      Reporter: Sankar Mittapally (sankar.mittapally@creditvidya.com)
      Votes: 0
      Watchers: 1
