The class org.apache.log4j.jdbc.JDBCAppender in Log4j 1.2.7 (and probably earlier versions too) leaks memory. In the method flushBuffer(), events are added to the ArrayList removes (removes.add(logEvent)), but this list is never cleared and its entries are never removed, so the appender eventually fails with an OutOfMemoryError. The version of flushBuffer() below does not leak. Besides this, we have some ideas for improving performance for user threads by adding a dedicated thread that handles the database writing:

  public void append(LoggingEvent event) {
    buffer.add(event);
    if (buffer.size() >= bufferSize)
      logThread.wakeup();
  }

  public synchronized void flushBuffer() {
    LoggingEvent logEvent = null;
    while (buffer.size() > 0) {
      try {
        logEvent = (LoggingEvent) buffer.remove(0);
      } catch (Exception ignore) {
        break;
      }
      try {
        String sql = getLogStatement(logEvent);
        execute(sql);
      } catch (SQLException sqle) {
        // Unable to store LogEvent in database, put it back in buffer.
        if (logEvent != null)
          buffer.add(logEvent); // I'm not sure this is a good idea
        errorHandler.error("Failed to execute sql", sqle, ErrorCode.FLUSH_FAILURE);
      }
    }
  }
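A minimal, self-contained sketch of the drain pattern described above may make the fix easier to review. The names FlushSketch and SqlSink are illustrative stand-ins, not part of log4j: each event is removed from the buffer before the write is attempted (so nothing accumulates in a side list like removes), and on failure the event is re-queued at the tail; retries are bounded to one pass per flush to avoid the infinite loop the re-queueing caveat above hints at.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the appender's SQL execution.
interface SqlSink {
    void execute(String sql) throws Exception;
}

// Hypothetical sketch of the leak-free flush: events leave the buffer
// as they are processed instead of piling up in a never-cleared list.
class FlushSketch {
    final List<String> buffer = new ArrayList<>();
    private final SqlSink sink;

    FlushSketch(SqlSink sink) {
        this.sink = sink;
    }

    void append(String event) {
        buffer.add(event);
    }

    synchronized void flushBuffer() {
        int attempts = buffer.size(); // bound retries to a single pass
        while (attempts-- > 0 && !buffer.isEmpty()) {
            String event = buffer.remove(0); // removed before the write attempt
            try {
                sink.execute(event);
            } catch (Exception e) {
                buffer.add(event); // keep the event for a later flush
            }
        }
    }
}
```

After a flush, successfully written events are gone from the buffer and only failed ones remain, so repeated flushes cannot grow memory without bound the way the original removes list did.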
This is a duplicate of the earlier-reported bug #14827, which will be addressed in the v1.2.8 release. If you are interested in improving JDBCAppender, I urge you to become involved with its development on the log4j-dev mailing list. Thanks. *** This bug has been marked as a duplicate of 14827 ***