Flume / FLUME-2669

Flume Appender for log4j 1.x - Support for event batching


Details

    • Type: New Feature
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved

    Description

      Better support for Flume in the log4j 1.x appender may be beneficial.
      The appender already supports load balancing with back-off and transparent failover. What it lacks is the ability to batch a number of logging events together and send them over the wire in one go.
      With a blocking queue (available since Java 5), it would be fairly easy to implement the following logic:
      1) the appender is initialized with a blocking queue in AppenderSkeleton.activateOptions() and acts as the producer for it: every call to AppenderSkeleton.append() adds a logging event to the queue and returns.
      2) a consumer thread is also started in AppenderSkeleton.activateOptions() and is handed the same queue. The thread continuously takes events from the queue and stores them in a buffer, blocking while the queue is empty. Once the buffer is full, it calls RpcClient.appendBatch() and then clears the buffer.
      3) if an error occurs while sending the batch to a remote Flume agent (i.e. the RPC client fails to deliver it after trying all configured agents), either an exception is thrown (safe mode) or the events are silently dropped (unsafe mode).
      4) the size of the in-memory buffer / batch, as well as the capacity of the blocking queue, should be made configurable for greater flexibility.
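      The producer/consumer flow in steps 1-4 could be sketched roughly as follows. This is an illustrative sketch only: the EventBatcher class, the sendBatch callback, and the use of plain String events are assumptions standing in for the real appender, RpcClient.appendBatch(), and log4j LoggingEvents.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

// Hypothetical sketch of the proposed batching logic; not Flume API.
class EventBatcher {
    private final BlockingQueue<String> queue;      // filled by append() (step 1)
    private final int batchSize;
    private final Consumer<List<String>> sendBatch; // stands in for RpcClient.appendBatch()
    private final Thread consumer;
    private volatile boolean running = true;

    EventBatcher(int batchSize, int queueCapacity, Consumer<List<String>> sendBatch) {
        this.queue = new ArrayBlockingQueue<>(queueCapacity); // step 4: both sizes configurable
        this.batchSize = batchSize;
        this.sendBatch = sendBatch;
        // Step 2: consumer thread buffers events and ships full batches.
        this.consumer = new Thread(() -> {
            List<String> buffer = new ArrayList<>(batchSize);
            while (running || !queue.isEmpty()) {
                try {
                    // Poll with a timeout so the loop can notice shutdown.
                    String event = queue.poll(100, TimeUnit.MILLISECONDS);
                    if (event != null) {
                        buffer.add(event);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    break;
                }
                if (buffer.size() >= batchSize) {
                    sendBatch.accept(new ArrayList<>(buffer)); // one RPC per full batch
                    buffer.clear();
                }
            }
            if (!buffer.isEmpty()) {
                sendBatch.accept(buffer); // flush the remainder on shutdown
            }
        });
        this.consumer.start();
    }

    // Step 1: append() just enqueues the event and returns.
    void append(String event) {
        try {
            queue.put(event); // blocks if the queue is full (back-pressure)
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // Drain outstanding events and stop the consumer thread.
    void close() {
        running = false;
        try {
            consumer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```

      Step 3 (safe vs. unsafe error handling) would wrap the sendBatch call in a try/catch that either rethrows or swallows the delivery failure, depending on configuration.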


          People

            Assignee: Unassigned
            Reporter: Neerja Khattar (neerjakhattar)
            Votes: 0
            Watchers: 1
