KAFKA-6980: Recommended MaxDirectMemorySize for consumers


Details

    • Type: Wish
    • Status: Open
    • Priority: Minor
    • Resolution: Unresolved
    • Affects Version/s: 0.10.2.0
    • Fix Version/s: None
    • Component/s: consumer, documentation
    • Environment: CloudFoundry

    Description

      We are observing that when MaxDirectMemorySize is set too low, our Kafka consumer threads fail with the following exception:

      java.lang.OutOfMemoryError: Direct buffer memory

      Is there a way to estimate how much direct memory is required for optimal performance? The documentation suggests that the amount of memory required is [Number of Partitions * max.partition.fetch.bytes].

      When we pick a value slightly above that estimate, we no longer encounter the error, but if we double or triple it, our throughput improves drastically. We are therefore wondering whether there is another setting or parameter we should take into account.
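
      For reference, a minimal sketch of the arithmetic behind the documented estimate, assuming a hypothetical partition count and the default max.partition.fetch.bytes of 1 MiB; the class name, example values, and 3x headroom factor below are illustrative only, not a recommendation from the Kafka documentation:

      public class DirectMemoryEstimate {
          public static void main(String[] args) {
              // Hypothetical example values; substitute the consumer's real assignment.
              int assignedPartitions = 120;                 // partitions fetched by this consumer
              long maxPartitionFetchBytes = 1024L * 1024L;  // max.partition.fetch.bytes default (1 MiB)

              // Lower bound suggested by the consumer documentation:
              // Number of Partitions * max.partition.fetch.bytes.
              long documentedLowerBound = assignedPartitions * maxPartitionFetchBytes;

              // Headroom for in-flight fetch responses and network buffers; this issue
              // reports that 2-3x the documented estimate gave much better throughput.
              long withHeadroom = 3 * documentedLowerBound;

              System.out.printf("Documented lower bound: %d MiB%n", documentedLowerBound >> 20);
              System.out.printf("Candidate -XX:MaxDirectMemorySize: %d MiB%n", withHeadroom >> 20);
              // e.g. java -XX:MaxDirectMemorySize=360m -cp app.jar com.example.ConsumerApp
          }
      }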



          People

            Assignee: Unassigned
            Reporter: John Lu (jlu717)
            Votes: 0
            Watchers: 2
