Hadoop Common / HADOOP-15524

BytesWritable causes OOME when array size reaches Integer.MAX_VALUE

    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: io
    • Labels: None

    Description

      BytesWritable.setSize uses Integer.MAX_VALUE to initialize the internal array. In my environment, this causes an OOME:

      Exception in thread "main" java.lang.OutOfMemoryError: Requested array size exceeds VM limit
      

      The allocation must be capped at byte[Integer.MAX_VALUE - 2] to prevent this error.

      Tested on OSX and CentOS 7 using Java version 1.8.0_131.
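
      For reference, a minimal reproduction sketch (assuming hadoop-common is on the classpath; the exact OOME message can depend on the VM and heap settings, but the request above the VM's array-size limit fails as shown above):

      import org.apache.hadoop.io.BytesWritable;

      public class BytesWritableOome {
        public static void main(String[] args) {
          BytesWritable bw = new BytesWritable();
          // Asking for a size this large makes setSize() grow the backing array
          // to Integer.MAX_VALUE, which the VM refuses with
          // "OutOfMemoryError: Requested array size exceeds VM limit".
          bw.setSize(Integer.MAX_VALUE);
        }
      }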

      I noticed that java.util.ArrayList contains the following:

      /**
       * The maximum size of array to allocate.
       * Some VMs reserve some header words in an array.
       * Attempts to allocate larger arrays may result in
       * OutOfMemoryError: Requested array size exceeds VM limit
       */
      private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;
      


      BytesWritable.setSize should use something similar to prevent an OOME from occurring.
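
      For illustration, a sketch of what such a clamp could look like; MAX_ARRAY_SIZE mirrors java.util.ArrayList, and the 1.5x growth factor mirrors the existing BytesWritable behavior, but this is an assumption about the shape of the fix, not the actual patch:

      public final class SafeCapacityDemo {

        // Largest array most VMs will allocate without
        // "Requested array size exceeds VM limit" (mirrors java.util.ArrayList).
        private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;

        // Grow by roughly 1.5x, but never beyond the VM's array-size limit.
        static int grownCapacity(int requestedSize) {
          return (int) Math.min(MAX_ARRAY_SIZE, (3L * requestedSize) / 2L);
        }

        public static void main(String[] args) {
          System.out.println(grownCapacity(1024));              // 1536
          System.out.println(grownCapacity(Integer.MAX_VALUE)); // 2147483639, still allocatable
        }
      }

      Whatever exact constant is chosen, the point is that the cap applied when growing the array should be MAX_ARRAY_SIZE rather than Integer.MAX_VALUE.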


    People

    • Assignee: Unassigned
    • Reporter: jesmith3 Joseph Smith
    • Votes: 1
    • Watchers: 4

    Dates

    • Created:
    • Updated: