Details

Type: Bug
Status: Closed
Resolution: Fixed
Version: 0.20.5
Operating System: All
Platform: All
Bug ID: 32354
Description
If you set a large scale factor (say 400%) on the AWTRenderer and render the first
page, it works, but switching to the next page fails with an OutOfMemoryError.
The problem is in the render(Page) method, where a new BufferedImage and Graphics
object are allocated. During the BufferedImage allocation, the previous pageImage
and graphics cannot be garbage collected because they are still referenced, so
enough memory for two BufferedImages is required.
This can be fixed by setting the pageImage and graphics variables to null before
allocating a new BufferedImage.
Also, after the page has been rendered, the graphics variable is no longer needed
and can be released for garbage collection.
Here's my proposed fix:

public void render(Page page) throws IOException {
    // Release the previous page's image and graphics so they can be
    // garbage collected before the new BufferedImage is allocated.
    pageImage = null;
    graphics = null;
    // ... allocate the new BufferedImage and render the page ...
    // After rendering, the graphics object is no longer needed.
    graphics = null;
}
I checked the current CVS to see whether this has been addressed, but it now
generates a BufferedImage for each page and stores them all in a
bufferedImageList. This would surely require a large amount of memory!
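To put a rough number on that concern, here is a back-of-the-envelope estimate of the memory one page image needs at a 400% scale. The A4-at-72-dpi page size (595 x 842 points) and 4 bytes per pixel (TYPE_INT_RGB backing store) are assumptions for illustration, not values taken from FOP:

```java
public class PageMemoryEstimate {

    /** Bytes needed for one 4-bytes-per-pixel page image at the given scale. */
    static long estimateBytes(int baseWidth, int baseHeight, double scale) {
        long width = Math.round(baseWidth * scale);
        long height = Math.round(baseHeight * scale);
        return width * height * 4L; // 4 bytes per pixel
    }

    public static void main(String[] args) {
        // A4 at 72 dpi is 595 x 842 points; a 400% scale quadruples each side.
        long bytes = estimateBytes(595, 842, 4.0);
        System.out.println(bytes / (1024 * 1024) + " MB per page");
    }
}
```

At roughly 30 MB per page under these assumptions, keeping every rendered page in a bufferedImageList would exhaust a typical default heap after only a handful of pages.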
I personally prefer the Java2DHook-style approach described at
http://nagoya.apache.org/wiki/apachewiki.cgi?FOPAvalonization/AltAPIProposalJM
where the user supplies the Graphics2D to render to, so they can manage their
own memory.
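A minimal sketch of what such a hook could look like. The interface and method names below are my own invention for illustration (not the actual API from the proposal); the point is that the caller owns the image, so it can reuse one allocation across pages:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

/** Hypothetical callback: the caller supplies the Graphics2D for each page. */
interface Java2DHook {
    Graphics2D getGraphicsForPage(int pageNumber, int width, int height);
}

/** Example implementation that reuses a single image, keeping memory constant. */
class ReusingHook implements Java2DHook {
    private BufferedImage image;

    public Graphics2D getGraphicsForPage(int pageNumber, int width, int height) {
        // Allocate only when the current image is missing or too small.
        if (image == null || image.getWidth() < width || image.getHeight() < height) {
            image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        }
        return image.createGraphics();
    }

    public BufferedImage getImage() {
        return image;
    }
}
```

Because the caller owns the BufferedImage, it can reuse one image across pages as shown, draw straight to a screen or printer Graphics2D, or discard each page as soon as it is consumed, instead of the renderer accumulating one image per page.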