ZEPPELIN-295: Property "spark.executor.memory" has no effect

Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version: 0.6.0
    • Fix Version: None
    • Component: Core
    • Labels: None
    • Environment: external Spark 1.4.1 standalone cluster, OSX 10.10.5, Java 7; Zeppelin built from source at commit b4b4f5521a57fd3b0902b5e3ab0e228c10b8bac5

    Description

      It appears that the "spark.executor.memory" property is not passed to the SparkContext when it is created in SparkInterpreter.

      Steps to reproduce:

      • edit zeppelin-env.sh to add
        export ZEPPELIN_JAVA_OPTS="-Dspark.executor.memory=1G -Dspark.cores.max=2"
      • start Zeppelin and execute some paragraphs.
      • Spark Master UI shows that the app's "Memory per node" is 512M
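Each -D token in ZEPPELIN_JAVA_OPTS is appended to the interpreter JVM's command line and so becomes a JVM system property; that is the mechanism the steps above rely on to get the value into SparkInterpreter. A minimal sketch of that mapping (the class and method names are illustrative, not Zeppelin code):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: how -D tokens in ZEPPELIN_JAVA_OPTS map to system-property
// key/value pairs. JavaOptsSketch and parseDashD are hypothetical names.
public class JavaOptsSketch {
    static Map<String, String> parseDashD(String opts) {
        Map<String, String> props = new LinkedHashMap<>();
        for (String token : opts.trim().split("\\s+")) {
            if (token.startsWith("-D")) {
                // Strip the "-D" prefix and split on the first '=' only,
                // so values containing '=' survive intact.
                String[] kv = token.substring(2).split("=", 2);
                props.put(kv[0], kv.length > 1 ? kv[1] : "");
            }
        }
        return props;
    }

    public static void main(String[] args) {
        String opts = "-Dspark.executor.memory=1G -Dspark.cores.max=2";
        Map<String, String> props = parseDashD(opts);
        System.out.println(props.get("spark.executor.memory")); // 1G
        System.out.println(props.get("spark.cores.max"));       // 2
    }
}
```

With this value present as a system property, the bug is that SparkInterpreter reads it but the effective configuration still ends up at the 512M default.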

      After a little digging I found the code that reads this option from the environment or system properties, at SparkInterpreter:97. Editing that line to hard-code a value (and rebuilding/restarting) still didn't work. Setting the property around line 269 didn't work either; only setting it just before the return from createSparkContext() (around line 311) actually worked, i.e. the application got the right amount of memory.

      So it seems that this property is overwritten somewhere between these lines.
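The observed behavior is consistent with a later, unconditional assignment clobbering the user's value somewhere between line 97 and line 311. A minimal sketch of that pattern, with a plain java.util.Properties standing in for Spark's configuration object (all names here are hypothetical, not the actual SparkInterpreter code):

```java
import java.util.Properties;

// Sketch of the suspected overwrite pattern: a user-supplied value is
// read early, then clobbered by a later unconditional default before
// the configuration is returned.
public class OverwriteSketch {
    static Properties createSparkContextConf() {
        Properties conf = new Properties();

        // ~line 97: the user's value is picked up from system properties.
        String requested = System.getProperty("spark.executor.memory", "512m");
        conf.setProperty("spark.executor.memory", requested);

        // Somewhere before the return, a later step re-applies the
        // default unconditionally, discarding the user's setting.
        conf.setProperty("spark.executor.memory", "512m");

        // ~line 311: only a set placed here, after the clobbering step,
        // would survive into the SparkContext.
        return conf;
    }

    public static void main(String[] args) {
        System.setProperty("spark.executor.memory", "1G");
        Properties conf = createSparkContextConf();
        // Despite the -D flag, the effective value is the default:
        System.out.println(conf.getProperty("spark.executor.memory")); // 512m
    }
}
```

This matches the report: setting the property early has no effect, while setting it immediately before the return from createSparkContext() does.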


          People

            Assignee: Unassigned
            Reporter: Andrzej Bialecki (ab)
            Votes: 2
            Watchers: 8
