Hadoop Common / HADOOP-15524

BytesWritable causes OOME when array size reaches Integer.MAX_VALUE


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.4.0
    • Component/s: io
    • Labels: None

      Description

      BytesWritable.setSize uses Integer.MAX_VALUE to size the internal array. In my environment, this causes an OOME:

      Exception in thread "main" java.lang.OutOfMemoryError: Requested array size exceeds VM limit
      

      byte[Integer.MAX_VALUE-2] must be used to prevent this error.

      Tested on OS X and CentOS 7 using Java 1.8.0_131.
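
      For reference, the VM limit itself can be reproduced outside of Hadoop. The class below is a standalone illustration only (not Hadoop code, and the class name is just for the example); on the JVMs described above, the oversized request fails with the same message as in the stack trace:

      // Standalone illustration of the VM array-size limit (not Hadoop code).
      public class ArraySizeLimit {
        public static void main(String[] args) {
          // byte[Integer.MAX_VALUE - 2] was the largest byte array that worked in
          // the tests above; asking for Integer.MAX_VALUE elements failed with
          //   java.lang.OutOfMemoryError: Requested array size exceeds VM limit
          byte[] tooBig = new byte[Integer.MAX_VALUE];
          System.out.println("allocated " + tooBig.length + " bytes"); // never reached
        }
      }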

      I noticed that java.util.ArrayList contains the following:

      /**
       * The maximum size of array to allocate.
       * Some VMs reserve some header words in an array.
       * Attempts to allocate larger arrays may result in
       * OutOfMemoryError: Requested array size exceeds VM limit
       */
      private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;
      

       

      BytesWritable.setSize should use something similar to prevent an OOME from occurring.
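
      A minimal sketch of such a guard is below, assuming an ArrayList-style MAX_ARRAY_SIZE cap. The class name and method bodies are illustrative only, not the committed BytesWritable patch:

      // Illustrative sketch of the proposed cap; not the actual BytesWritable change.
      public class CappedGrowthExample {

        // Same reasoning as java.util.ArrayList: some VMs reserve header words,
        // so stay a few elements short of Integer.MAX_VALUE.
        private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8;

        private byte[] bytes = new byte[0];
        private int size = 0;

        public void setSize(int newSize) {
          if (newSize < 0) {
            throw new IllegalArgumentException("negative size: " + newSize);
          }
          if (newSize > bytes.length) {
            // Grow by ~1.5x in long arithmetic so the intermediate value cannot
            // overflow, then clamp to the VM-safe maximum before allocating.
            long desired = Math.max((long) newSize, (long) bytes.length * 3 / 2);
            int capacity = (int) Math.min(desired, MAX_ARRAY_SIZE);
            if (capacity < newSize) {
              throw new IllegalArgumentException(
                  "size " + newSize + " exceeds maximum array size " + MAX_ARRAY_SIZE);
            }
            bytes = java.util.Arrays.copyOf(bytes, capacity);
          }
          size = newSize;
        }

        public int getLength() {
          return size;
        }
      }

      Whether a request above the cap should fail fast (as in the sketch) or be clamped silently is a separate design choice for the actual patch.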

       

    People

    • Assignee: Joseph Smith (jesmith3)
    • Reporter: Joseph Smith (jesmith3)
    • Votes: 3
    • Watchers: 7

              Dates

              • Created:
                Updated:
                Resolved: