Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.14.0
    • Fix Version/s: 0.14.0
    • Component/s: None
    • Labels:
      None

      Description

      I recently started running the unit tests with -Dcompile.c++=yes so that pipes is compiled and its unit tests are run.

      TestPipes.testPipes consistently fails on Linux with
      junit.framework.AssertionFailedError: got exception: java.io.IOException: Job failed!
      at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:625)
      at org.apache.hadoop.mapred.pipes.Submitter.submitJob(Submitter.java:250)
      at org.apache.hadoop.mapred.pipes.Submitter.main(Submitter.java:404)
      at org.apache.hadoop.mapred.pipes.TestPipes.runNonPipedProgram(TestPipes.java:173)
      at org.apache.hadoop.mapred.pipes.TestPipes.testPipes(TestPipes.java:69)

      at org.apache.hadoop.mapred.pipes.TestPipes.runNonPipedProgram(TestPipes.java:180)
      at org.apache.hadoop.mapred.pipes.TestPipes.testPipes(TestPipes.java:69)

      which is perhaps caused by
      2007-08-14 02:10:18,831 INFO mapred.TaskRunner (ReduceTaskRunner.java:close(45)) - task_200708140209_0003_r_000000_0 done; removing files.
      2007-08-14 02:10:18,841 INFO mapred.TaskInProgress (TaskInProgress.java:updateStatus(371)) - Error from task_200708140209_0003_r_000000_0: java.io.IOException: pipe child exception
      at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:130)
      at org.apache.hadoop.mapred.pipes.PipesReducer.close(PipesReducer.java:103)
      at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:328)
      at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1778)
      Caused by: java.net.SocketException: Broken pipe
      at java.net.SocketOutputStream.socketWrite0(Native Method)
      at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
      at java.net.SocketOutputStream.write(SocketOutputStream.java:115)
      at java.io.DataOutputStream.writeByte(DataOutputStream.java:136)
      at org.apache.hadoop.io.WritableUtils.writeVLong(WritableUtils.java:278)
      at org.apache.hadoop.io.WritableUtils.writeVInt(WritableUtils.java:258)
      at org.apache.hadoop.mapred.pipes.BinaryProtocol.close(BinaryProtocol.java:281)
      at org.apache.hadoop.mapred.pipes.PipesReducer.close(PipesReducer.java:95)
      ... 2 more

        Activity

        Doug Cutting added a comment -

        I just committed this. Thanks, Owen!

        Owen O'Malley added a comment -

        This patch fixes the wordcount-nonpipe example and the TestPipes code to deal with the fact that the input and output paths now include schemes. Clearly they need to be removed before they are passed to the C++ file utils.

        Owen O'Malley added a comment -

        Hmm, the problem is that the input file is being converted to a string and is picking up the scheme now. So instead of trying to open "/foo/part1" it is trying to open "file:/foo/part1". In particular, I get in the stderr file:

        Hadoop Pipes Exception: failed to open file:/home/oom/work/eclipse/hadoop/build/test/data/pipes/input/part1 at /home/oom/work/eclipse/hadoop/src/examples/pipes/impl/wordcount-nopipe.cc:49 in WordCountReader::WordCountReader(HadoopPipes::MapContext&)

        The fix is going to be in the wordcount nopipe example.
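        As a sketch of the kind of fix described above: the scheme prefix (e.g. "file:") can be stripped from the stringified path before it reaches plain file utilities that expect a bare filesystem path. The helper name stripScheme and its exact logic are assumptions for illustration, not the committed patch:

        ```cpp
        #include <cassert>
        #include <string>

        // Remove a leading URI scheme such as "file:" from a path string,
        // so "file:/foo/part1" becomes "/foo/part1". A colon is only treated
        // as a scheme delimiter if it appears before the first '/'.
        std::string stripScheme(const std::string& path) {
          std::string::size_type colon = path.find(':');
          if (colon != std::string::npos && path.find('/') > colon) {
            return path.substr(colon + 1);
          }
          return path;
        }

        int main() {
          assert(stripScheme("file:/foo/part1") == "/foo/part1");
          assert(stripScheme("/foo/part1") == "/foo/part1");  // unchanged
          return 0;
        }
        ```

        A path already lacking a scheme passes through unchanged, so the helper is safe to apply unconditionally before opening the file.
        
        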


          People

          • Assignee:
            Owen O'Malley
            Reporter:
            Nigel Daley
          • Votes:
            0
            Watchers:
            0

            Dates

            • Created:
              Updated:
              Resolved:
