  HBase / HBASE-26527

ArrayIndexOutOfBoundsException in KeyValueUtil.copyToNewKeyValue()


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.7, 3.0.0-alpha-2
    • Fix Version/s: 2.5.0, 3.0.0-alpha-2, 2.4.9
    • Component/s: wal
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      While investigating a Phoenix crash, I've found a possible problem in KeyValueUtil.

      When using Phoenix, we need to configure (at least for older versions) org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec as the WAL codec in HBase.
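
      For context, this is the usual Phoenix setup, added to hbase-site.xml on the region servers (shown here only to make the scenario reproducible):

        <property>
          <name>hbase.regionserver.wal.codec</name>
          <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
        </property>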

      This codec will eventually serialize standard (not Phoenix-specific) WAL entries to the WAL file, and internally converts the Cell objects to KeyValue objects by building a new byte[].

      This fails with an ArrayIndexOutOfBoundsException, because we allocate a byte[] the size of Cell.getSerializedSize(), and it seems that we are processing a Cell that does not actually serialize the column family and the fields after it.
      However, we are building a traditional KeyValue object for serialization, which does serialize them, hence we run out of bytes.
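
      To illustrate, a minimal sketch of the failing pattern (not the actual HBase source; it assumes the appendToByteArray(Cell, byte[], int, boolean) overload seen in the stack trace is accessible from this context):

        import org.apache.hadoop.hbase.Cell;
        import org.apache.hadoop.hbase.KeyValueUtil;

        public class CopyOverrunSketch {
          // The destination buffer is sized from the source cell's own encoding,
          // but the bytes are then written in the full KeyValue layout, which is
          // longer for cells whose getSerializedSize() does not cover the
          // family/qualifier/value/tags portions.
          static byte[] copyViaSourceSize(Cell cell) {
            byte[] buf = new byte[cell.getSerializedSize()];      // sized from the source cell
            KeyValueUtil.appendToByteArray(cell, buf, 0, true);   // writes KeyValue format; overruns if buf is short
            return buf;
          }
        }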

      I think that since we are writing a KeyValue, we should not rely on the getSerializedSize() method of the source cell, but rather calculate the backing array size based on how KeyValue expects its data to be serialized.
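
      A minimal sketch of what I mean, assuming the standard KeyValue wire layout (the helper is hypothetical, not the committed fix, and the tags length is passed in separately so the sketch does not depend on which Cell subinterface exposes it):

        import org.apache.hadoop.hbase.Cell;

        public class KeyValueSizeSketch {
          // KeyValue layout: <4B key len><4B value len><key><value>[<2B tags len><tags>]
          // Key layout:      <2B row len><row><1B family len><family><qualifier><8B timestamp><1B type>
          static int keyValueSerializedSize(Cell cell, int tagsLength) {
            int keyLength = 2 + cell.getRowLength()        // row length prefix + row
                + 1 + cell.getFamilyLength()               // family length prefix + family
                + cell.getQualifierLength()                // qualifier
                + 8 + 1;                                   // timestamp + type byte
            int total = 4 + 4                              // key length + value length ints
                + keyLength
                + cell.getValueLength();                   // value
            if (tagsLength > 0) {
              total += 2 + tagsLength;                     // tags length prefix + tags
            }
            return total;
          }
        }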

      The stack trace for reference:

      java.lang.ArrayIndexOutOfBoundsException: 9787
              at org.apache.hadoop.hbase.util.Bytes.putByte(Bytes.java:502)
              at org.apache.hadoop.hbase.KeyValueUtil.appendKeyTo(KeyValueUtil.java:142)
              at org.apache.hadoop.hbase.KeyValueUtil.appendToByteArray(KeyValueUtil.java:156)
              at org.apache.hadoop.hbase.KeyValueUtil.copyToNewByteArray(KeyValueUtil.java:133)
              at org.apache.hadoop.hbase.KeyValueUtil.copyToNewKeyValue(KeyValueUtil.java:97)
              at org.apache.phoenix.util.PhoenixKeyValueUtil.maybeCopyCell(PhoenixKeyValueUtil.java:214)
              at org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec$IndexKeyValueEncoder.write(IndexedWALEditCodec.java:218)
              at org.apache.hadoop.hbase.regionserver.wal.ProtobufLogWriter.append(ProtobufLogWriter.java:59)
              at org.apache.hadoop.hbase.regionserver.wal.FSHLog.doAppend(FSHLog.java:294)
              at org.apache.hadoop.hbase.regionserver.wal.FSHLog.doAppend(FSHLog.java:65)
              at org.apache.hadoop.hbase.regionserver.wal.AbstractFSWAL.appendEntry(AbstractFSWAL.java:931)
              at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.append(FSHLog.java:1075)
              at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:964)
              at org.apache.hadoop.hbase.regionserver.wal.FSHLog$RingBufferEventHandler.onEvent(FSHLog.java:873)
              at com.lmax.disruptor.BatchEventProcessor.run(BatchEventProcessor.java:129)
              at java.lang.Thread.run(Thread.java:748)
      

      Note that I am still not sure exactly what triggers this bug; one possibility is org.apache.hadoop.hbase.ByteBufferKeyOnlyKeyValue.


            People

              Assignee: stoty Istvan Toth
              Reporter: stoty Istvan Toth
              Votes: 0
              Watchers: 5
