Hadoop Map/Reduce / MAPREDUCE-6907

Hadoop "Spill failed" with custom ArrayWritable and data type - "java.lang.Exception: java.lang.NegativeArraySizeException"


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.7.3
    • Fix Version/s: None
    • Component/s: None
    • Labels: None
    • Environment: Ubuntu 16.04 (GNOME Desktop) on Dell Inspiron 7352, i7 5500U, 8GB RAM, 512GB SSD
    • Flags: Important

    Description

      When I run a MapReduce program which uses a custom ArrayWritable based on a custom data type, I get this error:

      java.io.IOException: Spill failed
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.checkSpillException(MapTask.java:1562)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1471)
      at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:723)
      at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:2019)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:797)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
      at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:748)
      Caused by: java.lang.NegativeArraySizeException
      at org.apache.hadoop.io.ArrayWritable.readFields(ArrayWritable.java:93)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
      at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:146)
      at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
      at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:302)
      at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
      at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1688)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1637)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$900(MapTask.java:876)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1532)
      17/06/26 16:41:54 INFO mapred.LocalJobRunner: map task executor complete.
      17/06/26 16:41:54 WARN mapred.LocalJobRunner: job_local680262639_0001
      java.lang.Exception: java.io.IOException: Spill failed
      at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
      at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:522)
      Caused by: java.io.IOException: Spill failed
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.checkSpillException(MapTask.java:1562)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$300(MapTask.java:876)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$Buffer.write(MapTask.java:1372)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$Buffer.write(MapTask.java:1349)
      at java.io.DataOutputStream.writeShort(DataOutputStream.java:167)
      at edu.nyu.cusp.umg.mapreduce.houghplane.CellWritable.write(CellWritable.java:55)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:98)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:82)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1149)
      at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:715)
      at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
      at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
      at edu.nyu.cusp.umg.mapreduce.houghplane.HoughTransformRasterCombiner$HoughTransformMapper.writeFull(HoughTransformRasterCombiner.java:253)
      at edu.nyu.cusp.umg.mapreduce.houghplane.HoughTransformRasterCombiner$HoughTransformMapper.map(HoughTransformRasterCombiner.java:158)
      at edu.nyu.cusp.umg.mapreduce.houghplane.HoughTransformRasterCombiner$HoughTransformMapper.map(HoughTransformRasterCombiner.java:59)
      at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
      at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
      at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
      at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:243)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:748)
      Caused by: java.lang.NegativeArraySizeException
      at org.apache.hadoop.io.ArrayWritable.readFields(ArrayWritable.java:93)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
      at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
      at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKeyValue(ReduceContextImpl.java:146)
      at org.apache.hadoop.mapreduce.task.ReduceContextImpl.nextKey(ReduceContextImpl.java:121)
      at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.nextKey(WrappedReducer.java:302)
      at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:170)
      at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1688)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1637)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.access$900(MapTask.java:876)
      at org.apache.hadoop.mapred.MapTask$MapOutputBuffer$SpillThread.run(MapTask.java:1532)

      Searching online, I found that this appears to be a known MapReduce bug. It is reported as "fixed", but apparently the fix does not cover a custom ArrayWritable whose element type is a custom WritableComparable. Is this a problem with my code or with Hadoop?
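      For context, ArrayWritable.readFields allocates its backing array from a length it reads off the stream (values = new Writable[in.readInt()]), so a negative size generally means the deserializer has fallen out of step with what the serializer wrote. With a custom ArrayWritable plus a combiner, as in the trace above, the two usual culprits are (a) a subclass without a no-argument constructor that passes the element class to super(), which WritableSerialization needs when it reflectively reconstructs the value during the combine pass, and (b) a write()/readFields() pair on the element type that are not byte-for-byte symmetric. A minimal sketch of the shape that avoids both problems; the fields here are hypothetical stand-ins, since the actual CellWritable source is not attached:

      import java.io.DataInput;
      import java.io.DataOutput;
      import java.io.IOException;

      import org.apache.hadoop.io.ArrayWritable;
      import org.apache.hadoop.io.WritableComparable;

      // Hypothetical stand-in for the reporter's CellWritable (fields are illustrative).
      class CellWritable implements WritableComparable<CellWritable> {
        private short theta;
        private int votes;

        // No-arg constructor: required for reflective instantiation during deserialization.
        public CellWritable() {}

        @Override
        public void write(DataOutput out) throws IOException {
          out.writeShort(theta);   // every field written here ...
          out.writeInt(votes);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
          theta = in.readShort();  // ... must be read back in the same order and width
          votes = in.readInt();
        }

        @Override
        public int compareTo(CellWritable o) {
          int c = Short.compare(theta, o.theta);
          return c != 0 ? c : Integer.compare(votes, o.votes);
        }
      }

      // The array wrapper must set the element class in a no-arg constructor;
      // without it, the combine-side deserializer cannot rebuild the elements.
      class CellArrayWritable extends ArrayWritable {
        public CellArrayWritable() {
          super(CellWritable.class);
        }

        public CellArrayWritable(CellWritable[] values) {
          super(CellWritable.class, values);
        }
      }

      If a combiner is configured, the map output value class should also be the concrete subclass rather than ArrayWritable itself (job.setMapOutputValueClass(CellArrayWritable.class) in this hypothetical naming), so that the combine-side deserializer instantiates a class that knows its element type.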

Attachments

Activity

People

    • Assignee: Unassigned
    • Reporter: Neel Chauhan (neelc)
    • Votes: 0
    • Watchers: 1

Dates

    • Created:
    • Updated: