Spark / SPARK-16667

Spark driver/executor don't release unused memory


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 1.6.0
    • Fix Version/s: None
    • Component/s: GraphX, Spark Core
    • Labels: None
    • Environment: Ubuntu Wily 64-bit
      Java 1.8
      3 slaves (4 GB) + 1 master (2 GB) virtual machines in VMware, on an i7 4th-generation host with 16 GB RAM

    Description

  I'm running a Spark app in a standalone cluster. My app creates a SparkContext and performs many GraphX calculations over time. For each calculation, the app creates a new Java thread and waits for its ending signal. Between calculations, memory grows by 50–100 MB. I use a thread to be sure that every object created for a calculation is destroyed when the calculation ends, but memory keeps growing. I tried stopping the SparkContext: all the executor memory allocated by the app is freed, but the driver's memory still grows by the same 50–100 MB.
  My graph calculation includes serializing RDDs to HDFS and loading the graph back from HDFS.
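The thread-per-calculation pattern described above can be sketched with a `CountDownLatch` acting as the "ending signal" (names are hypothetical; in the real app the worker thread would drive GraphX jobs through the shared SparkContext):

```java
import java.util.concurrent.CountDownLatch;
import java.util.function.IntSupplier;

public class CalcRunner {
    // Runs one "calculation" on its own thread and blocks until the thread
    // signals completion, mirroring the reporter's pattern. Spinning up a
    // thread per calculation scopes the calculation's local objects to that
    // thread, but it does not by itself shrink the driver JVM's heap.
    public static int runCalculation(IntSupplier work) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        int[] result = new int[1];
        Thread t = new Thread(() -> {
            try {
                result[0] = work.getAsInt();
            } finally {
                done.countDown(); // the "ending signal"
            }
        });
        t.start();
        done.await(); // wait for the calculation's ending signal
        t.join();     // let the thread itself terminate too
        return result[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runCalculation(() -> 21 * 2)); // prints 42
    }
}
```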

      Spark env:
      export SPARK_MASTER_IP=master
      export SPARK_WORKER_CORES=4
      export SPARK_WORKER_MEMORY=2919m
      export SPARK_WORKER_INSTANCES=1
      export SPARK_DAEMON_MEMORY=256m
      export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=10"
      Those are my only configuration settings.
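The "Not A Problem" resolution is consistent with normal JVM behavior: heap that the JVM has committed from the operating system (`Runtime.totalMemory()`) is generally not returned promptly even after the objects in it are collected, so the driver process can appear to hold or grow memory while its *used* heap is actually stable. A minimal stand-alone sketch of the used-versus-committed distinction (class name hypothetical):

```java
public class HeapUsageDemo {
    // Returns how many bytes of *used* heap are reclaimed after dropping a
    // ~50 MB allocation and requesting a GC. Note that Runtime.totalMemory()
    // -- the heap the JVM has committed from the OS -- often stays high even
    // after the used heap shrinks, which is one common reason a driver
    // process seems to "keep" memory at the OS level.
    public static long usedHeapDropAfterGc() {
        Runtime rt = Runtime.getRuntime();
        byte[] big = new byte[50 * 1024 * 1024];
        big[big.length - 1] = 1; // touch the array so it is materialized
        long usedBefore = rt.totalMemory() - rt.freeMemory();
        big = null;              // drop the only reference
        System.gc();             // request (not guarantee) a collection
        long usedAfter = rt.totalMemory() - rt.freeMemory();
        return usedBefore - usedAfter;
    }

    public static void main(String[] args) {
        System.out.println("reclaimed bytes: " + usedHeapDropAfterGc());
    }
}
```

Also note that the `spark.worker.cleanup.*` options shown above only purge stopped applications' work *directories on disk*; they do not affect live driver or executor heap usage.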


          People

            Assignee: Unassigned
            Reporter: Luis Angel Hernández Acosta (LAHA)
            Votes: 0
            Watchers: 2
