Hadoop Map/Reduce / MAPREDUCE-1239

Mapreduce test build is broken after HADOOP-5107

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.21.0, 0.22.0
    • Fix Version/s: 0.21.0
    • Component/s: build
    • Labels:
      None
    • Hadoop Flags:
      Reviewed
      Attachments

    1. m-1239.patch
      7 kB
      Owen O'Malley
    2. mapred-1239.patch
      6 kB
      Giridharan Kesavan

      Activity

      Vinod Kumar Vavilapalli added a comment -

      The build fails because of absent dependencies on the core/hdfs jars in the contrib projects. For example, try to run TestCapacityScheduler.
      Giridharan Kesavan added a comment -

      This fixes the dependencies for the test-contrib target, but some test cases still fail because of a missing mapred queues config.

      Sreekanth said he would take a look at the failing test cases.
      Sreekanth Ramakrishnan added a comment -

      Checked the streaming and fair-scheduler test cases. Test cases in fairscheduler fail with a "too many open files" message; apart from that, test-streaming passes fully.
      Owen O'Malley added a comment -

      Is it related to MAPREDUCE-1241?
      Owen O'Malley added a comment -

      Why do the contrib components need the whole list of indirect dependencies? That is not the way Maven dependencies are supposed to work.
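      Owen's point can be illustrated with an Ivy sketch: a contrib module should declare only its direct dependency and let transitive resolution pull in the core/hdfs jars, rather than enumerating every indirect jar by hand. The module names, version property, and conf mapping below are illustrative, not taken from the actual Hadoop 0.21 build files:

      ```xml
      <!-- Hypothetical contrib ivy.xml fragment: declare only the direct
           dependency; transitive="true" (Ivy's default) resolves the core
           and hdfs jars it depends on, so the contrib project does not
           have to list each indirect jar itself. -->
      <dependencies>
        <dependency org="org.apache.hadoop" name="hadoop-mapred"
                    rev="${hadoop-mapred.version}" conf="common->default"
                    transitive="true"/>
      </dependencies>
      ```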
      Sreekanth Ramakrishnan added a comment -

      The issue on Giri's box could have been due to an old mapred-queues.xml, i.e. MAPREDUCE-1241, but what I observed when the fairscheduler test was run is that TestFairScheduler was timing out with the following exception:

          [junit] 09/11/25 16:34:08 INFO mapred.JobTracker: JobTracker up at: 32801
          [junit] 09/11/25 16:34:08 INFO mapred.JobTracker: JobTracker webserver: 44348
          [junit] 09/11/25 16:34:08 INFO mapred.JobTracker: Cleaning up the system directory
          [junit] 09/11/25 16:34:08 INFO mapred.JobTracker: problem cleaning system directory: file:/tmp/hadoop-sreerama/mapred/system
          [junit] java.io.IOException: Cannot run program "chmod": java.io.IOException: error=24, Too many open files
          [junit] 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:474)
          [junit] 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:188)
          [junit] 	at org.apache.hadoop.util.Shell.run(Shell.java:170)
          [junit] 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:363)
          [junit] 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:449)
          [junit] 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:432)
          [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:540)
          [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:532)
          [junit] 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:281)
          [junit] 	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:295)
          [junit] 	at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1477)
          [junit] 	at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1306)
          [junit] 	at org.apache.hadoop.mapred.JobTracker.<init>(JobTracker.java:1299)
          [junit] 	at org.apache.hadoop.mapred.UtilsForTests.getJobTracker(UtilsForTests.java:712)
          [junit] 	at org.apache.hadoop.mapred.TestFairScheduler.testPoolAssignment(TestFairScheduler.java:2546)
          [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
          [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          [junit] 	at java.lang.reflect.Method.invoke(Method.java:616)
          [junit] 	at junit.framework.TestCase.runTest(TestCase.java:168)
          [junit] 	at junit.framework.TestCase.runBare(TestCase.java:134)
          [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:110)
          [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:128)
          [junit] 	at junit.framework.TestResult.run(TestResult.java:113)
          [junit] 	at junit.framework.TestCase.run(TestCase.java:124)
          [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:232)
          [junit] 	at junit.framework.TestSuite.run(TestSuite.java:227)
          [junit] 	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:79)
          [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
          [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
          [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
          [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
          [junit] Caused by: java.io.IOException: java.io.IOException: error=24, Too many open files
          [junit] 	at java.lang.UNIXProcess.<init>(UNIXProcess.java:164)
          [junit] 	at java.lang.ProcessImpl.start(ProcessImpl.java:81)
          [junit] 	at java.lang.ProcessBuilder.start(ProcessBuilder.java:467)
          [junit] 	... 31 more
      
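      The error=24 in the trace is EMFILE: the process hit its per-process open-file limit while forking "chmod". A quick way to check whether the limit on the test box is the bottleneck is to inspect RLIMIT_NOFILE before the run; this is a generic diagnostic sketch, not part of the patch:

      ```python
      import resource

      # Read the soft and hard limits on open file descriptors for this
      # process (errno 24 / EMFILE means the soft limit was exhausted).
      soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
      print(f"open-file limit: soft={soft}, hard={hard}")

      # A low soft limit (1024 is a common default) is easy to exhaust when
      # a test suite forks shells while many sockets and log files are
      # already open; raising the soft limit to the hard limit often
      # unblocks the run without any root privileges.
      if hard != resource.RLIM_INFINITY and soft < hard:
          resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
      ```

      The same check from a shell is `ulimit -n`; the limit can also simply be raised system-wide by the machine's administrator.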
      Owen O'Malley added a comment -

      On my machine all of the contribs passed with this version of the patch. I had to add hadoop-common-test to sqoop; otherwise, it had a failure.
      Giridharan Kesavan added a comment -

      Tested m-1239.patch and hit the same exception as Sreekanth mentioned.
      Otherwise this patch looks good.
      +1
      Giridharan Kesavan added a comment -

      Let me commit this to fix the trunk build failure and file a JIRA for the fairscheduler test failure in TestFairScheduler.
      Vinod Kumar Vavilapalli added a comment -

      MAPREDUCE-1245 is created for the TestFairScheduler failure.
      Giridharan Kesavan added a comment -

      I just committed this to trunk. Thanks, Owen.
      Vinod Kumar Vavilapalli added a comment -

      This has to be fixed in 0.21 too, as HADOOP-5107 was pulled into the 0.21 branch.
      Hudson added a comment -

      Integrated in Hadoop-Mapreduce-trunk-Commit #134 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk-Commit/134/)
      Owen O'Malley added a comment -

      I committed this to 0.21 also.
      Hudson added a comment -

      Integrated in Hadoop-Mapreduce-trunk-Commit #135 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk-Commit/135/)
      Fix contrib components build dependencies.
      (Giridharan Kesavan and omalley)
      Move the change message into 0.21.0.
      Hudson added a comment -

      Integrated in Hadoop-Mapreduce-trunk #162 (See http://hudson.zones.apache.org/hudson/job/Hadoop-Mapreduce-trunk/162/)

        People

        • Assignee:
          Giridharan Kesavan
          Reporter:
          Vinod Kumar Vavilapalli
        • Votes:
          0
          Watchers:
          5
