Apache Avro / AVRO-1111

Malformed data can cause OutOfMemoryError in Avro IPC


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.3
    • Fix Version/s: 1.7.2
    • Component/s: java
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      If the data that comes in through the Netty channel buffer is not framed correctly or is not valid Avro data, the incoming bytes can cause arbitrarily large ArrayLists to be created, leading to an OutOfMemoryError.

      The relevant code (org.apache.avro.ipc.NettyTransportCodec):

      private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
          ChannelBuffer buffer) throws Exception {
        if (buffer.readableBytes() < 8) {
          return false;
        }
        int serial = buffer.readInt();
        listSize = buffer.readInt();
        dataPack = new NettyDataPack(serial, new ArrayList<ByteBuffer>(listSize));
        return true;
      }

      If the buffer does not contain valid Avro data, the listSize variable can take an arbitrary value read straight off the wire, causing massive ArrayLists to be allocated and leading to OutOfMemoryErrors.
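      One way to defend against this (not necessarily what the attached patches do; a minimal sketch assuming a hypothetical MAX_LIST_SIZE cap and a hypothetical safeList helper) is to treat the header-supplied size as untrusted and bound the initial ArrayList capacity, letting the list grow on demand if the frame really does carry that many buffers:

      ```java
      import java.nio.ByteBuffer;
      import java.util.ArrayList;
      import java.util.List;

      public class HeaderGuard {
          // Hypothetical cap on pre-allocation; a real fix would pick a limit
          // appropriate for legitimate Avro IPC traffic.
          static final int MAX_LIST_SIZE = 1024;

          // Builds the initial list from the untrusted header value, rejecting
          // negative sizes and never pre-allocating more than MAX_LIST_SIZE slots.
          static List<ByteBuffer> safeList(int listSize) {
              if (listSize < 0) {
                  throw new IllegalArgumentException("negative list size: " + listSize);
              }
              return new ArrayList<ByteBuffer>(Math.min(listSize, MAX_LIST_SIZE));
          }

          public static void main(String[] args) {
              // A malformed header claiming Integer.MAX_VALUE entries no longer
              // triggers a multi-gigabyte allocation; the list starts empty and
              // grows only as buffers actually arrive.
              List<ByteBuffer> buffers = safeList(Integer.MAX_VALUE);
              System.out.println(buffers.size());
          }
      }
      ```

      Capping only the initial capacity keeps well-formed large frames working (ArrayList resizes as needed) while ensuring a garbage header cannot by itself force a huge allocation.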

      Attachments

        1. AVRO-1111-1.patch
          3 kB
          Mike Percy
        2. AVRO-1111-2.patch
          3 kB
          Mike Percy


        Activity


          People

            Assignee: mpercy (Mike Percy)
            Reporter: hshreedharan (Hari Shreedharan)
            Votes: 0
            Watchers: 9

            Dates

              Created:
              Updated:
              Resolved:
