SPARK-31908

Spark UI shows wrong driver memory configurations if driver memory is provided at run time


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.4.3
    • Fix Version/s: None
    • Component/s: Web UI
    • Labels: None

    Description

      I discovered that in cluster mode, when driver memory is provided via the spark.driver.memory configuration at run time (after creating the Spark session), Spark does not pick up this configuration: the application master has already been launched by that time, so the driver falls back to the default driver memory (1 GB).
      However, the Environment tab of the Spark UI still shows driver memory as the value passed via the run-time configuration, which makes identifying and debugging this scenario more difficult. Driver memory should be shown as the value Spark is actually using for the job.
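      A minimal sketch of the scenario in Scala (assuming YARN cluster mode; the object name, the app name, and the Runtime-based heap check are illustrative, not part of the original report):

      import org.apache.spark.sql.SparkSession

      object DriverMemoryRepro {
        def main(args: Array[String]): Unit = {
          // In cluster mode the driver JVM (the application master) is already
          // running with the default 1g heap by the time this code executes,
          // so setting spark.driver.memory here does not change the driver heap.
          val spark = SparkSession.builder()
            .appName("driver-memory-repro")      // hypothetical app name
            .config("spark.driver.memory", "4g") // recorded in the conf, not applied to the JVM
            .getOrCreate()

          // Value the UI's Environment tab reports (the misleading "4g"):
          println("spark.driver.memory = " + spark.conf.get("spark.driver.memory"))

          // Approximate heap the driver process is actually running with (~1g default):
          println("driver max heap bytes = " + Runtime.getRuntime.maxMemory)

          spark.stop()
        }
      }

      Submitting this with spark-submit --deploy-mode cluster (and no --driver-memory flag) shows the mismatch: the Environment tab reports 4g while the driver heap stays at the default.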

      Attachments

        Activity

          People

            Assignee: Unassigned
            Reporter: Rahul Kumar (rkthe1)
            Votes: 0
            Watchers: 2

            Dates

              Created:
              Updated: