Qpid JMS
QPIDJMS-292

consumers may buffer vastly more messages than expected for their 'prefetch' value


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.22.0
    • Fix Version/s: 0.23.0
    • Component/s: qpid-jms-client
    • Labels: None

    Description

      Consumers may buffer vastly more messages than expected for their 'prefetch' value.

      The consumer link credit handling is governed by the 'prefetch' settings, which aim to limit the number of messages buffered locally for the consumer, protecting the consumer and enabling the server to distribute messages more evenly between multiple consumers as and when appropriate, particularly consumers that attach later than others.

      The link credit handling has, however, not been correctly accounting for messages that are already buffered but not yet consumed, but rather only for the credit itself at the point of inspection. As a result, if the credit is sufficiently utilised (>= 70%) between processing of messages, more link credit is granted to top the consumer back up to the 'prefetch' value. This potentially allows it to buffer more than <prefetch> messages, and to grant further credit even after it has already done so. If there is a significant backlog of messages available, the broker can feed them to the client quickly enough, and consumption does not keep up at the same rate, this can repeat to the extent of buffering as many messages as are available.
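      As a rough illustration (hypothetical code, not the actual client implementation), topping credit up to 'prefetch' while only inspecting the remaining credit lets the total of outstanding credit plus locally buffered messages grow well past the configured prefetch, whereas corrected accounting subtracts the buffered messages first:

        // Hypothetical sketch of the credit top-up accounting; the names do not
        // correspond to actual qpid-jms-client classes.
        class CreditTopUpSketch {
            int prefetch = 1000; // configured prefetch value
            int credit = 0;      // link credit currently granted to the broker
            int buffered = 0;    // messages received locally but not yet consumed

            // Flawed accounting: ignores 'buffered', so once >= 70% of the credit
            // is used it tops back up to 'prefetch', and credit + buffered can
            // exceed the prefetch after repeated top-ups.
            void topUpIgnoringBuffered() {
                if (credit <= prefetch * 0.3) {
                    credit = prefetch;
                }
            }

            // Corrected accounting: counts buffered messages as well, so no more
            // than 'prefetch' messages are outstanding for the consumer in total.
            void topUpCountingBuffered() {
                int outstanding = credit + buffered;
                if (outstanding <= prefetch * 0.3) {
                    credit += prefetch - outstanding;
                }
            }
        }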

      The only obvious workarounds would be reducing the prefetch to 0 or 1.
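      Prefetch is configured on the connection URI; the following is a minimal sketch of applying the workaround, assuming the standard jms.prefetchPolicy URI options (the broker address and the rest of the application are placeholders):

        import javax.jms.Connection;
        import javax.jms.JMSException;

        import org.apache.qpid.jms.JmsConnectionFactory;

        public class PrefetchWorkaround {
            public static void main(String[] args) throws JMSException {
                // Workaround: set prefetch to 0 (or 1) so the consumer cannot
                // buffer more messages than intended until the fix is available.
                String uri = "amqp://localhost:5672?jms.prefetchPolicy.all=0";
                JmsConnectionFactory factory = new JmsConnectionFactory(uri);

                Connection connection = factory.createConnection();
                connection.start();
                // ... create the session and consumer, and receive as usual ...
                connection.close();
            }
        }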


          People

            Assignee: Robbie Gemmell
            Reporter: Robbie Gemmell
            Votes: 0
            Watchers: 2
