Spark / SPARK-23145

How to set Apache Spark Executor memory


Details

    • Type: Question
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.1.0
    • Fix Version/s: None
    • Component/s: EC2
    • Labels: None

    Description

      How can I increase the memory available to Apache Spark executor nodes?

      I have a 2 GB file that is suitable for loading into Apache Spark. For the moment I am running Apache Spark on one machine, so the driver and executor are on the same machine. The machine has 8 GB of memory.
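
      A minimal sketch of the kind of spark-shell session involved (the file path is a placeholder, not the actual file):

        // Run inside spark-shell, where `sc` is the predefined SparkContext.
        val lines = sc.textFile("/path/to/large-file.txt")
        lines.cache()   // mark the RDD for in-memory caching
        lines.count()   // triggers the read; caching happens during this action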

      When I try to count the lines of the file after marking it to be cached in memory, I get this error:

        2014-10-25 22:25:12 WARN CacheManager:71 - Not enough space to cache partition rdd_1_1 in memory! Free memory is 278099801 bytes.

      I looked at the documentation here and set spark.executor.memory to 4g in $SPARK_HOME/conf/spark-defaults.conf.
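
      For reference, the corresponding line in spark-defaults.conf looks like this:

        # $SPARK_HOME/conf/spark-defaults.conf
        spark.executor.memory  4g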

      The UI shows that this variable is set in the Environment tab of the Spark UI. You can find a screenshot here.

      However, when I go to the Executors tab, the memory limit for my single executor is still 265.4 MB, and I still get the same error.
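
      One way to double-check which value the running shell actually picked up (a sketch; sc.getConf.get throws NoSuchElementException if the key was never set):

        // Inside spark-shell: returns the value the SparkContext was started with.
        sc.getConf.get("spark.executor.memory")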

      I tried the various things mentioned here, but I still get the error and don't have a clear idea of where I should change the setting.

      I am running my code interactively from the spark-shell.
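
      One note on local mode: since the driver and executor share a single JVM in this setup, a commonly suggested approach (an assumption here, not confirmed in this issue) is to size the driver itself when launching the shell, for example:

        # Give the driver JVM (and thus the in-process executor) more heap at launch.
        $SPARK_HOME/bin/spark-shell --driver-memory 4g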

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Azharuddin
            Votes: 0
            Watchers: 1

            Dates

              Created:
              Updated:
              Resolved:

              Time Tracking

                Original Estimate: 954h
                Remaining Estimate: 954h
                Time Spent: Not Specified