Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
Even though they may no longer be referenced, Java only cleans up direct buffers on a full GC. If there is enough heap available, a full GC never runs and these buffers are leaked. Hadoop keeps creating new compressors instead of reusing them from the pools, which causes the leak; that is a bug in itself and is being addressed by HADOOP-10591.
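The behavior described above can be sketched with a minimal, self-contained program (the class name and sizes are illustrative, not taken from Flume or Hadoop). Each `ByteBuffer.allocateDirect` call reserves native memory outside the Java heap; that native memory is released only when the buffer object is garbage-collected, so with a large, mostly-idle heap the dropped buffers can accumulate indefinitely:

```java
import java.nio.ByteBuffer;

public class DirectBufferLeakDemo {
    public static void main(String[] args) {
        // Allocate direct buffers and immediately drop the references.
        // The 1 MiB of native memory behind each buffer is NOT freed here;
        // it is reclaimed only when GC collects the ByteBuffer object,
        // which may never happen if the heap has plenty of headroom.
        for (int i = 0; i < 100; i++) {
            ByteBuffer buf = ByteBuffer.allocateDirect(1024 * 1024);
            buf.put(0, (byte) 1); // touch the buffer so it is really backed
        }
        System.out.println("allocated 100 direct buffers");
    }
}
```

Running with a cap such as `-XX:MaxDirectMemorySize` makes the failure mode visible: once the unreclaimed native memory exceeds the cap, allocation throws `OutOfMemoryError: Direct buffer memory` even though the heap itself is nearly empty. Pooling and reusing the buffers (or the compressors that own them), as HADOOP-10591 proposes, avoids the unbounded allocation.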
Attachments
Issue Links
- is broken by: HADOOP-10591 "Compression codecs must used pooled direct buffers or deallocate direct buffers when stream is closed" (Closed)
- is duplicated by: FLUME-2356 "HDFSCompressedDataStream memory leak" (Resolved)