Spark / SPARK-22292

Add spark.mem.max to limit the amount of memory received from Mesos


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.2.0
    • Fix Version/s: None
    • Component/s: Mesos
    • Environment: Spark with Mesos

    Description

      To limit the amount of resources a Spark job accepts from Mesos, currently only `spark.cores.max` is available, and it constrains the job only in terms of CPU cores. However, when executors are configured with large amounts of memory, a job can still consume all of the cluster's memory while staying under its core limit. A `spark.mem.max` setting would cap the total memory accepted from Mesos in the same way.
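      A minimal sketch of the idea, assuming a memory cap is enforced alongside the existing core cap when deciding whether to launch another executor from a Mesos offer. The names (`Caps`, `Totals`, `canLaunchExecutor`) and the default values are illustrative only and are not the actual Spark Mesos scheduler backend code; `spark.mem.max` itself is the proposed, not yet existing, setting.

```scala
// Hypothetical sketch: accept an offer only while the job stays within
// both spark.cores.max and a proposed spark.mem.max. Not Spark source code.
object OfferCapSketch {

  final case class Caps(maxCores: Int, maxMemMb: Long)   // configured limits
  final case class Totals(cores: Int, memMb: Long)       // resources already acquired

  /** Launch another executor only if it keeps the job within both caps. */
  def canLaunchExecutor(caps: Caps,
                        acquired: Totals,
                        executorCores: Int,
                        executorMemMb: Long): Boolean = {
    val withinCores = acquired.cores + executorCores <= caps.maxCores
    val withinMem   = acquired.memMb + executorMemMb <= caps.maxMemMb
    withinCores && withinMem
  }

  def main(args: Array[String]): Unit = {
    // e.g. spark.cores.max=16 and a hypothetical spark.mem.max of 64 GB
    val caps = Caps(maxCores = 16, maxMemMb = 64 * 1024L)

    // Two executors of 4 cores / 30 GB each are already running.
    val acquired = Totals(cores = 8, memMb = 60 * 1024L)

    // A third 4-core / 30 GB executor fits the core cap (12 <= 16) but
    // would exceed the memory cap (90 GB > 64 GB), so the offer is declined.
    println(canLaunchExecutor(caps, acquired, executorCores = 4, executorMemMb = 30 * 1024L))
  }
}
```

      This mirrors how `spark.cores.max` already bounds the total cores a job takes from Mesos; the memory check is simply applied as an additional condition before accepting an offer.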

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: windkithk
