I think I was able to reproduce what you did: you first searched for 13K entries, they were displayed in the search result editor, and then you exported them, right?
Performing a search with 10K results and displaying them in the search result editor costs about 50 MB of memory. The search results are cached in memory as long as your connection is open. 5 kB per entry sounds like a lot, but it covers our internal object model (entry, DN, RDN, attributes, values, parent-child relationships) and the UI objects used to display that model (Table, Rows, Columns, Fonts).
Exporting 10K entries to Excel costs about another 50 MB of memory. We use the Apache POI library for that, and the Excel file must be built completely in memory before it is written out.
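To illustrate why that is, here is a minimal sketch (not Studio's actual export code; the list of pre-fetched rows is hypothetical) of an Apache POI export: every row and cell becomes a Java object on the heap, and nothing reaches the file until write() at the very end.

```java
import java.io.FileOutputStream;
import java.util.List;

import org.apache.poi.hssf.usermodel.HSSFRow;
import org.apache.poi.hssf.usermodel.HSSFSheet;
import org.apache.poi.hssf.usermodel.HSSFWorkbook;

public class ExcelExportSketch {

    /** rows: one String[] of attribute values per search result entry (hypothetical input). */
    public static void export(List<String[]> rows, String fileName) throws Exception {
        // The whole workbook is built as objects on the heap;
        // nothing is written to disk until write() at the end.
        HSSFWorkbook workbook = new HSSFWorkbook();
        HSSFSheet sheet = workbook.createSheet("Search Result");

        for (int i = 0; i < rows.size(); i++) {
            HSSFRow row = sheet.createRow(i);
            String[] values = rows.get(i);
            for (int j = 0; j < values.length; j++) {
                row.createCell(j).setCellValue(values[j]);
            }
        }

        FileOutputStream out = new FileOutputStream(fileName);
        try {
            workbook.write(out);
        } finally {
            out.close();
        }
    }
}
```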
Exporting to LDIF or CSV is cheaper because each entry received from the server is streamed to the file immediately. (With LDIF the CPU usage is too high, though; I need to check that...)
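For comparison, a streaming export looks roughly like this. This is a sketch using plain JNDI, not Studio's real export code, and the server URL, base DN, and filter are just placeholders: each entry is written out as soon as it arrives, so only one entry is held in memory at a time.

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.util.Hashtable;

import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.Attribute;
import javax.naming.directory.InitialDirContext;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;

public class LdifStreamExportSketch {

    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:10389"); // placeholder server

        InitialDirContext ctx = new InitialDirContext(env);
        SearchControls controls = new SearchControls();
        controls.setSearchScope(SearchControls.SUBTREE_SCOPE);

        BufferedWriter writer = new BufferedWriter(new FileWriter("export.ldif"));
        NamingEnumeration<SearchResult> results =
                ctx.search("ou=people,dc=example,dc=com", "(objectClass=person)", controls);
        try {
            // Each entry is written (and becomes garbage-collectable) before
            // the next one is fetched, so memory use stays flat.
            while (results.hasMore()) {
                SearchResult entry = results.next();
                writer.write("dn: " + entry.getNameInNamespace());
                writer.newLine();
                NamingEnumeration<? extends Attribute> attrs = entry.getAttributes().getAll();
                while (attrs.hasMore()) {
                    Attribute attr = attrs.next();
                    for (int i = 0; i < attr.size(); i++) {
                        writer.write(attr.getID() + ": " + attr.get(i));
                        writer.newLine();
                    }
                }
                writer.newLine(); // blank line between LDIF records
            }
        } finally {
            writer.close();
            ctx.close();
        }
    }
}
```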
So 40 MB (Studio/Eclipse footprint) + 50 MB (10,000 search results) + 50 MB (Excel export) = 140 MB, which is more than the 128 MB default heap size.
So here is what I would suggest (if appropriate):
- You already increased the heap memory.
- Only perform small searches within Studio; use the count limit and/or paged search (see the sketch after this list).
- If you need to export large amounts of data:
  - use CSV (you already do) or LDIF
  - if you need Excel, first close all connections (to flush the caches), open only the relevant connection, and run the export without performing a search first
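Paged search uses the Simple Paged Results control (RFC 2696). Here is a minimal sketch with plain JNDI, again with a placeholder server URL, base DN, and page size, showing how only one page of entries needs to be processed at a time:

```java
import java.util.Hashtable;

import javax.naming.Context;
import javax.naming.NamingEnumeration;
import javax.naming.directory.SearchControls;
import javax.naming.directory.SearchResult;
import javax.naming.ldap.Control;
import javax.naming.ldap.InitialLdapContext;
import javax.naming.ldap.LdapContext;
import javax.naming.ldap.PagedResultsControl;
import javax.naming.ldap.PagedResultsResponseControl;

public class PagedSearchSketch {

    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<String, String>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:10389"); // placeholder server
        LdapContext ctx = new InitialLdapContext(env, null);

        int pageSize = 500;   // placeholder: only one page of entries per round trip
        byte[] cookie = null;
        SearchControls controls = new SearchControls();
        controls.setSearchScope(SearchControls.SUBTREE_SCOPE);
        // controls.setCountLimit(1000); // the count limit is the simpler alternative: hard-cap the result size

        do {
            ctx.setRequestControls(new Control[] {
                    new PagedResultsControl(pageSize, cookie, Control.CRITICAL) });

            NamingEnumeration<SearchResult> results =
                    ctx.search("ou=people,dc=example,dc=com", "(objectClass=person)", controls);
            while (results.hasMore()) {
                SearchResult entry = results.next();
                System.out.println(entry.getNameInNamespace());
            }

            // The response control carries the cookie for the next page;
            // an empty cookie means the search is complete.
            cookie = null;
            Control[] responseControls = ctx.getResponseControls();
            if (responseControls != null) {
                for (Control c : responseControls) {
                    if (c instanceof PagedResultsResponseControl) {
                        cookie = ((PagedResultsResponseControl) c).getCookie();
                    }
                }
            }
        } while (cookie != null && cookie.length > 0);

        ctx.close();
    }
}
```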
Hm, perhaps we should start a new process for each search and export, like g**gle does with chr*me