Details
- Type: Bug
- Status: Resolved
- Priority: Minor
- Resolution: Incomplete
- Affects Version/s: 2.3.1
- Fix Version/s: None
- Environment: Spark 2.3.1 (AWS emr-5.16.0)
Description
While debugging an OOM exception during a long run of a Spark application (many iterations of the same code), I found that generated plans occupy most of the driver memory. I'm not sure whether this is a memory leak, but it would be helpful if old plans could be purged from memory anyway.
Attached are screenshots of OOM heap dump opened in JVisualVM.
Attachments
Issue Links
- is related to: SPARK-26103 OutOfMemory error with large query plans (Resolved)