AVRO-1111: Malformed data can cause OutOfMemoryError in Avro IPC

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.3
    • Fix Version/s: 1.7.2
    • Component/s: java
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

    If the data arriving on the Netty channel buffer is not framed correctly or is not valid Avro data, the incoming bytes can cause arbitrarily large array lists to be created, causing an OutOfMemoryError.

    The relevant code (org.apache.avro.ipc.NettyTransportCodec):

        private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
            ChannelBuffer buffer) throws Exception {
          // Wait until the 8-byte pack header (serial + list size) is available.
          if (buffer.readableBytes() < 8) {
            return false;
          }

          // The list size is read straight off the wire and used, unchecked, as the
          // initial capacity of the ArrayList backing the data pack.
          int serial = buffer.readInt();
          listSize = buffer.readInt();
          dataPack = new NettyDataPack(serial, new ArrayList<ByteBuffer>(listSize));
          return true;
        }

    If the buffer does not contain valid Avro data, the listSize variable can take an arbitrary value, causing massive ArrayLists to be created and leading to OutOfMemoryErrors.
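
    For illustration, a frame whose second 32-bit field claims a list size near Integer.MAX_VALUE makes the decoder try to allocate a backing array of roughly two billion references. A minimal sketch of one possible hardening, assuming the same fields (listSize, dataPack) as NettyTransportCodec, is to validate the claimed size before using it as an ArrayList capacity; the MAX_LIST_SIZE cap below is an illustrative assumption, not necessarily what the attached patches do:

        // Sketch only; MAX_LIST_SIZE is a hypothetical bound, not taken from the patch.
        private static final int MAX_LIST_SIZE = 8192;

        private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
            ChannelBuffer buffer) throws Exception {
          if (buffer.readableBytes() < 8) {
            return false;
          }

          int serial = buffer.readInt();
          listSize = buffer.readInt();

          // Reject frames whose claimed list size is negative or implausibly large,
          // rather than allocating a capacity chosen by the remote peer.
          if (listSize < 0 || listSize > MAX_LIST_SIZE) {
            throw new java.io.IOException(
                "Invalid Avro IPC frame: claimed list size " + listSize);
          }

          dataPack = new NettyDataPack(serial, new ArrayList<ByteBuffer>(listSize));
          return true;
        }

    An alternative with a similar effect is to stop trusting listSize for pre-sizing altogether and start from an empty ArrayList<ByteBuffer>(), letting the list grow only as buffers actually arrive.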

    Attachments

    1. AVRO-1111-1.patch (3 kB, Mike Percy)
    2. AVRO-1111-2.patch (3 kB, Mike Percy)

    People

    • Assignee: Mike Percy (mpercy)
    • Reporter: Hari Shreedharan (hshreedharan)
    • Votes: 0
    • Watchers: 9
