Description
Sometimes the test case TestFixedLengthInputFormat#testFormatCompressedIn can fail with the following error:
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
	at org.apache.hadoop.mapred.TestFixedLengthInputFormat.runRandomTests(TestFixedLengthInputFormat.java:322)
	at org.apache.hadoop.mapred.TestFixedLengthInputFormat.testFormatCompressedIn(TestFixedLengthInputFormat.java:90)
Root cause: under special circumstances, the following computation can produce a huge value:
// Test a split size that is less than record len
numSplits = (int)(fileSize/Math.floor(recordLength/2));
For example, let seed be 2026428718. This causes recordLength to be 1 at iteration 19. The integer division recordLength/2 then truncates to 0, so Math.floor() returns 0.0, and dividing fileSize by 0.0 yields positive Infinity. Casting that to int yields Integer.MAX_VALUE. Eventually we get an OOME because the test tries to create a huge InputSplit array.
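The overflow chain above can be reproduced in isolation. The snippet below is a minimal sketch (the fileSize value of 1024 is an arbitrary placeholder, not taken from the test) showing how recordLength = 1 turns the split computation into Integer.MAX_VALUE:

```java
public class SplitOverflowDemo {
    public static void main(String[] args) {
        long fileSize = 1024;    // hypothetical file size; any positive value works
        int recordLength = 1;    // the value produced at iteration 19

        // Integer division truncates: 1 / 2 == 0
        int half = recordLength / 2;
        System.out.println(half);

        // Math.floor(0) is 0.0, so the divisor is zero
        double divisor = Math.floor(half);
        System.out.println(divisor);

        // Floating-point division by 0.0 yields positive Infinity
        double quotient = fileSize / divisor;
        System.out.println(quotient);

        // Narrowing +Infinity to int saturates at Integer.MAX_VALUE (JLS 5.1.3)
        int numSplits = (int) quotient;
        System.out.println(numSplits == Integer.MAX_VALUE);
    }
}
```

Allocating an array sized by numSplits then requests roughly 2^31 elements, which is what triggers the "Requested array size exceeds VM limit" error.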