Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0-alpha1
    • Fix Version/s: 3.0.0-alpha1
    • Component/s: build
    • Labels:
      None
    • Target Version/s:
    • Hadoop Flags:
      Incompatible change
    • Release Note:
      * Turning on optional things from the tools directory, such as S3 support, can now be done in hadoop-env.sh with the HADOOP_OPTIONAL_TOOLS environment variable without impacting the various user-facing CLASSPATH variables.
      * The tools directory is no longer blindly pulled into the classpath by the utilities that previously did so.
      * TOOL_PATH / HADOOP_TOOLS_PATH has been broken apart and replaced with HADOOP_TOOLS_HOME, HADOOP_TOOLS_DIR and HADOOP_TOOLS_LIB_JARS_DIR to be consistent with the rest of Hadoop.
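      As a concrete illustration of the first bullet, opting into a tools module might look like this in hadoop-env.sh (the module names below are examples chosen for illustration, not a definitive list):

      ```shell
      # etc/hadoop/hadoop-env.sh (sketch): enable optional tools-dir
      # functionality without touching HADOOP_CLASSPATH or the other
      # user-facing CLASSPATH variables.
      export HADOOP_OPTIONAL_TOOLS="hadoop-aws,hadoop-azure"
      ```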

      Description

      As hadoop-tools grows bigger and bigger, it's becoming evident that a single directory that gets sucked in wholesale is an increasingly heavy burden as the number of tools increases. Let's rework this to be smarter.

      1. HADOOP-12857.00.patch
        41 kB
        Allen Wittenauer
      2. HADOOP-12857.01.patch
        42 kB
        Allen Wittenauer
      3. HADOOP-12857.02.patch
        48 kB
        Allen Wittenauer

        Issue Links

          Activity

          aw Allen Wittenauer added a comment -

          There have been several jiras of late (e.g., HADOOP-12556 and HADOOP-12721, to name two) where there is growing concern about ease of use vs. performance vs. surprises. In HADOOP-12556 I offered up two potential solutions to the issue:

          • Break hadoop-tools-dist apart into multiple directories, and create a shell profile that pulls in each functionality's entire dir.
          • Keep hadoop-tools-dist as one big dir (thus making it backward compatible, but still potentially messy), and build a tool that creates shell profiles based upon the Maven dependency trees to list the specific jars needed by that functionality.
          cnauroth Chris Nauroth added a comment -

          Just copy-pasting my comment from HADOOP-12556:

          I'm slightly in favor of option 2: keep it as one big dir. That gives an easy out if someone decides they really do want the whole world by putting share/hadoop/tools/lib/* on the classpath. OTOH, I suppose we could come up with a "whole world" shell profile that walks a more granular directory structure and gathers everything.

          In general, I really like the idea of using shell profiles to solve this problem. We still have a gap in that we don't have equivalent functionality on Windows. I have a hunch that it won't be feasible to offer all of the rich features of the full shell rewrite in cmd, but maybe we can do just enough to support classpath customization through profiles.

          aw Allen Wittenauer added a comment - - edited

          FWIW, I've got some stupid/simple shell code that takes the output of mvn dependency:list and builds a shell profile script.

          Some random notes:

          • It currently looks for ALL of the depended upon jars in the tools dir. This is less than efficient for what are hopefully obvious reasons.
          • HADOOP-10115 pretty much means that the shell profiles will need to be built well after we've processed the hadoop-tools dir in order to know what is/isn't already bundled via hadoop-common.

          So I'm contemplating two approaches to make the latter option work:

          1. Trigger mvn dependency:list in the build stage for those modules that need it. Push the output through the build process up until hadoop-dist gets triggered, then take that output and generate the profiles.
          2. In hadoop-dist, run mvn dependency:list for all modules under hadoop-tools (except some blacklisted ones), effectively having mvn run mvn, and then generate profiles as in #1.

          To make matters more complicated, I've been informed over the weekend that Bigtop-based distributions stupidly merge all of hadoop-tools into hadoop-common's lib dir. So they'll always have the perf hit and other issues that a flat dir structure causes.
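          The "stupid/simple" jar-list step described above could be sketched roughly like this (a hypothetical stand-in for the actual patch code; the function name and scope filtering are assumptions):

          ```shell
          # Hypothetical sketch: turn `mvn dependency:list` output into jar
          # filenames that a shell profile could append to the classpath.
          # Input lines look like:
          #   "[INFO]    org.apache.httpcomponents:httpclient:jar:4.5.2:compile"
          deps_to_jars() {
            while read -r line; do
              case "${line}" in
                "[INFO]"*:jar:*:compile|"[INFO]"*:jar:*:runtime)
                  # field 2 is the groupId:artifactId:jar:version:scope tuple
                  gav=$(printf '%s\n' "${line}" | awk '{print $2}')
                  artifact=$(printf '%s\n' "${gav}" | cut -d: -f2)
                  version=$(printf '%s\n' "${gav}" | cut -d: -f4)
                  printf '%s-%s.jar\n' "${artifact}" "${version}"
                  ;;
              esac
            done
          }
          ```

          Usage would be something like `mvn dependency:list | deps_to_jars`, with the resulting names resolved against the tools lib dir.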

          aw Allen Wittenauer added a comment -

          I have some sample code working. It was very enlightening and I know what to do now. If we really do want to keep one directory, here's my current plan of attack:

          • Truly optional components (s3, azure, openstack, kafka, etc) will have a shellprofile built that users can enable by doing the necessary incantations. I'm currently thinking I might be able to add content to hadoop-env.sh at build time to actually turn these things on via a single env-var setting or one per feature. No promises. (Yes, I'm currently looking for my "Black Hat of Bash Wizardry" to make this happen.) Worst case, it'll be a "copy and rename to HADOOP_CONF_DIR".
          • With some help from Ravi Prakash to make me see the forest for the trees, I can now build shell-parseable dependency lists at build time. I have two ways I can process these: I can either store the lists in the hadoop-dist target directory, or store them in the target directory of the actual tools and use a well-known name plus find to build the necessary shell magic at build time. I'm leaning towards the latter, since that will allow mvn clean to work in hadoop-dist in an expected way: there won't be a hidden dependency on hadoop-tools having been run before the mvn package.
          • distch, distcp, archive-logs, etc., are extremely problematic. Using shell profiles for these WILL NOT WORK, since they a) aren't really optional and b) removing them from the command-line tools won't really help anyone. Currently these commands load all of HADOOP_TOOLS_PATH, which is awful. I want to add to libexec/ a tools directory that stores helper functions for the tools jars required by the various subcommands. It will use code similar to, but distinct from, that of the optional components. It will key off a different filename for the dependency list, and there will need to be a contract between the helper function names and the dependency file name. (This sounds worse than it is.)

          I wish there was a way to dynamically add subcommands to hadoop, mapred, etc, but the code just isn't quite there yet. We can do usage now, but not actually execution.

          One big question: How should this work proceed?

          1. Single patch
          2. Multiple patches with a strict commit dependency order
          3. Separate branch followed by a branch merge

          Given this work will likely be all or nothing, I'm not a fan of multiple patches.
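          The shellprofile mechanism from the first bullet of the plan could look something like the sketch below. It follows the hook-naming style of the Hadoop 3 shell rewrite, but the exact function names are assumptions based on this discussion; the hadoop_* helpers are stubbed here so the sketch is self-contained (the real ones would live in libexec/hadoop-functions.sh):

          ```shell
          # Stubs standing in for the real shell-function library, so this
          # sketch runs on its own.
          HADOOP_CLASSPATH=""
          hadoop_add_profile() { :; }   # stub: would register the profile name
          hadoop_add_classpath() {      # stub: appends one path to the classpath
            HADOOP_CLASSPATH="${HADOOP_CLASSPATH:+${HADOOP_CLASSPATH}:}$1"
          }

          # Hypothetical per-tool profile (e.g. libexec/shellprofile.d/example.sh):
          # register the profile, and define a classpath hook that only wires in
          # this tool's jar. "example_tool" and the jar path are illustrative.
          hadoop_add_profile example_tool
          _example_tool_hadoop_classpath() {
            hadoop_add_classpath "${HADOOP_TOOLS_HOME}/share/hadoop/tools/lib/example-tool.jar"
          }
          ```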

          cnauroth Chris Nauroth added a comment -

          This sounds great overall. Thanks, Allen Wittenauer!

          Worst case, it'll be a "copy and rename to HADOOP_CONF_DIR".

          Or symlinks should be fine too, right?

          I'm leaning towards the latter since that will allow mvn clean to work in hadoop-dist in an expected way, since there won't be a hidden dependency on hadoop-tools having been run before the mvn package.

          Yes, agreed.

          distch, distcp, archive-logs, etc, are extremely problematic.

          Would it make sense to leave these alone as special cases for now and defer improving them to a separate patch? I think the primary benefit of this proposal is improved manageability of the truly optional components.

          As far as managing the work, I agree with ruling out option 2 (multiple patches). Since this will be all-or-nothing, I'd prefer not to put trunk into a strange intermediate state. If you anticipate this is really going to be too big to review at once, then a feature branch would give us flexibility to allow those intermediate states while work continues.

          Overall, this doesn't sound so huge that it would warrant a feature branch though, so I'm in favor of a single patch. Famous last words...

          aw Allen Wittenauer added a comment -

          Would it make sense to leave these alone as special cases for now and defer improving them to a separate patch? I think the primary benefit of this proposal is improved manageability of the truly optional components.

          Two things lead me to the answer no:

          a) More than half of the bits in hadoop-tools are being called by a script. (I know! It's way more than I expected!) The optional components are in the minority.

          b) We'll definitely end up with duplicate jars in the classpath for those bits. (The classpath de-duper doesn't expand the asterisks.)

          But really, it's not that much extra work to just do it in one pass. I'll likely have a patch in the next day or so. (ofc, being unemployed helps haha)

          aw Allen Wittenauer added a comment - - edited

          Why does the hdfs haadmin command require hadoop-tools in the classpath? Is this actually a long-standing bug/misunderstanding of where ToolRunner comes from?

          aw Allen Wittenauer added a comment -

          -00:

          tl;dr: HADOOP_TOOLS_PATH is no longer used in the codebase

          • removed toolspath from haadmin because I can't see what it needs from there and mvn dependencies don't list anything either
          • added various HADOOP_TOOLS_* vars to locate content, similar to what is present for the other parts of Hadoop
          • added those entries to the various envvars subcommands
          • added the necessary hooks to build profiles and built-ins
          • changed all of the built-ins to use the specific hooks for them at runtime
          • added generic *_entry handlers to deal with comma delimited options
          • added ability to turn on built-in optional components from hadoop-env.sh without doing anything crazy
          • added and modified quite a few shell unit tests to test all this code
          • added commons-httpclient back to openstack so I could move forward (see HADOOP-12868)

          Todo:

          • need to update the docs for S3, etc, to tell how to turn them on now
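          The "generic *_entry handlers" for comma-delimited options mentioned above could be sketched like this (a hypothetical helper; the real function names in the patch may differ):

          ```shell
          # Hypothetical sketch: succeed iff $2 appears as one element of the
          # comma-delimited list $1, e.g. the value of HADOOP_OPTIONAL_TOOLS.
          # Wrapping both sides in commas avoids substring false positives
          # (so "aws" does not match "hadoop-aws").
          hadoop_entry_enabled() {
            case ",$1," in
              *",$2,"*) return 0 ;;
            esac
            return 1
          }
          ```

          A caller could then do `hadoop_entry_enabled "${HADOOP_OPTIONAL_TOOLS}" hadoop-aws && _aws_hadoop_classpath`, keeping the per-tool hooks inert unless the user opted in.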
          cnauroth Chris Nauroth added a comment -

          Why does the hdfs haadmin command require hadoop-tools in the classpath? Is this actually a long-standing bug/misunderstanding of where ToolRunner comes from?

          I looked through revision history, and it appears that it was always this way, right from the old HDFS-1623 feature branch. I can't think of any good reason for it to do this, so I think it's a bug.

          aw Allen Wittenauer added a comment -

          OK, so I'm not imagining it. Thanks!

          FYI, -00 has a dumb bug that you won't see if you run the optional bits from the HADOOP_PREFIX dir. Grr. (The profiles aren't getting built with the HADOOP_TOOLS_HOME in the path.)

          I'll wait to see what yetus has to say before posting a new patch though. I'm sure there are whitespace and other issues lol.

          aw Allen Wittenauer added a comment -

          Argh. Patch is too big for Jenkins to handle.

          aw Allen Wittenauer added a comment -

          I'll break this up into four chunks so that jenkins timeout doesn't impact the patch.

          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 13s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 5 new or modified test files.
          0 mvndep 0m 25s Maven dependency ordering for branch
          +1 mvninstall 7m 23s trunk passed
          +1 compile 7m 4s trunk passed with JDK v1.8.0_72
          +1 compile 8m 2s trunk passed with JDK v1.7.0_95
          +1 mvnsite 13m 46s trunk passed
          +1 mvneclipse 4m 55s trunk passed
          +1 javadoc 9m 42s trunk passed with JDK v1.8.0_72
          +1 javadoc 13m 55s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 17s Maven dependency ordering for patch
          +1 mvninstall 10m 48s the patch passed
          +1 compile 8m 25s the patch passed with JDK v1.8.0_72
          +1 javac 8m 25s the patch passed
          +1 compile 8m 27s the patch passed with JDK v1.7.0_95
          +1 javac 8m 27s the patch passed
          +1 mvnsite 13m 11s the patch passed
          +1 mvneclipse 4m 30s the patch passed
          -1 shellcheck 0m 12s The applied patch generated 6 new + 99 unchanged - 0 fixed = 105 total (was 99)
          -1 whitespace 0m 0s The patch has 2 line(s) that end in whitespace. Use git apply --whitespace=fix.
          +1 xml 0m 6s The patch has no ill-formed XML file.
          +1 javadoc 8m 54s the patch passed with JDK v1.8.0_72
          +1 javadoc 12m 42s the patch passed with JDK v1.7.0_95
          +1 unit 9m 41s hadoop-common in the patch passed with JDK v1.8.0_72.
          -1 unit 70m 13s hadoop-hdfs in the patch failed with JDK v1.8.0_72.
          -1 unit 99m 2s hadoop-yarn in the patch failed with JDK v1.8.0_72.
          +1 unit 7m 27s hadoop-streaming in the patch passed with JDK v1.8.0_72.
          +1 unit 7m 26s hadoop-distcp in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 51s hadoop-archives in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 34s hadoop-archive-logs in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 22s hadoop-rumen in the patch passed with JDK v1.8.0_72.
          +1 unit 15m 7s hadoop-gridmix in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 26s hadoop-datajoin in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 56s hadoop-extras in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 15s hadoop-openstack in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 17s hadoop-aws in the patch passed with JDK v1.8.0_72.
          +1 unit 1m 22s hadoop-azure in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 58s hadoop-sls in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 14s hadoop-kafka in the patch passed with JDK v1.8.0_72.
          +1 unit 0m 14s hadoop-dist in the patch passed with JDK v1.8.0_72.
          -1 unit 138m 47s hadoop-mapreduce-project in the patch failed with JDK v1.8.0_72.
          +1 unit 9m 41s hadoop-common in the patch passed with JDK v1.7.0_95.
          +1 unit 58m 49s hadoop-hdfs in the patch passed with JDK v1.7.0_95.
          -1 unit 86m 17s hadoop-yarn in the patch failed with JDK v1.7.0_95.
          -1 unit 21m 30s hadoop-streaming in the patch failed with JDK v1.7.0_95.
          -1 unit 8m 18s hadoop-distcp in the patch failed with JDK v1.7.0_95.
          +1 unit 1m 17s hadoop-archives in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 50s hadoop-archive-logs in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 35s hadoop-rumen in the patch passed with JDK v1.7.0_95.
          +1 unit 18m 13s hadoop-gridmix in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 40s hadoop-datajoin in the patch passed with JDK v1.7.0_95.
          +1 unit 1m 17s hadoop-extras in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 29s hadoop-openstack in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 28s hadoop-aws in the patch passed with JDK v1.7.0_95.
          +1 unit 2m 1s hadoop-azure in the patch passed with JDK v1.7.0_95.
          +1 unit 1m 15s hadoop-sls in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 25s hadoop-kafka in the patch passed with JDK v1.7.0_95.
          +1 unit 0m 23s hadoop-dist in the patch passed with JDK v1.7.0_95.
          -1 unit 26m 16s hadoop-mapreduce-project in the patch failed with JDK v1.7.0_95.
          -1 asflicense 0m 45s Patch generated 25 ASF License warnings.
          728m 0s



          Reason Tests
          JDK v1.8.0_72 Failed junit tests hadoop.hdfs.security.TestDelegationToken
            hadoop.hdfs.server.namenode.ha.TestHAAppend
            hadoop.hdfs.security.TestDelegationTokenForProxyUser
            hadoop.yarn.server.resourcemanager.TestAMAuthorization
            hadoop.yarn.server.resourcemanager.TestClientRMTokens
            hadoop.mapreduce.v2.TestMRJobsWithProfiler
            hadoop.mapred.TestNetworkedJob
          JDK v1.8.0_72 Timed out junit tests org.apache.hadoop.yarn.server.resourcemanager.TestLeaderElectorService
          JDK v1.7.0_95 Failed junit tests hadoop.yarn.server.resourcemanager.TestAMAuthorization
            hadoop.yarn.server.resourcemanager.TestClientRMTokens
            hadoop.tools.mapred.lib.TestDynamicInputFormat
            hadoop.mapreduce.v2.TestMRJobsWithProfiler
            hadoop.mapred.TestNetworkedJob
            hadoop.mapreduce.v2.app.job.impl.TestJobImpl
            hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt
          JDK v1.7.0_95 Timed out junit tests org.apache.hadoop.streaming.TestMultipleArchiveFiles



          Subsystem Report/Notes
          Docker Image: yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12791106/HADOOP-12857.00.patch
          JIRA Issue HADOOP-12857
          Optional Tests asflicense shellcheck shelldocs mvnsite unit compile javac javadoc mvninstall xml
          uname Linux 0db3e80cf905 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 27941a1
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_72 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          shellcheck https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/diff-patch-shellcheck.txt
          whitespace https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/whitespace-eol.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_72.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.8.0_72.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.8.0_72.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-streaming-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-distcp-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-streaming-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-distcp-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/testReport/
          asflicense https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-asflicense-problems.txt
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs hadoop-yarn-project/hadoop-yarn hadoop-tools/hadoop-streaming hadoop-tools/hadoop-distcp hadoop-tools/hadoop-archives hadoop-tools/hadoop-archive-logs hadoop-tools/hadoop-rumen hadoop-tools/hadoop-gridmix hadoop-tools/hadoop-datajoin hadoop-tools/hadoop-extras hadoop-tools/hadoop-openstack hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure hadoop-tools/hadoop-sls hadoop-tools/hadoop-kafka hadoop-dist hadoop-mapreduce-project U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/console
          Powered by Apache Yetus 0.3.0-SNAPSHOT http://yetus.apache.org

          This message was automatically generated.

https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.8.0_72.txt unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.8.0_72.txt unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.7.0_95.txt unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-streaming-jdk1.7.0_95.txt unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-distcp-jdk1.7.0_95.txt unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.7.0_95.txt unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.8.0_72.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-yarn-project_hadoop-yarn-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-streaming-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-tools_hadoop-distcp-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-unit-hadoop-mapreduce-project-jdk1.7.0_95.txt JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/testReport/ asflicense 
https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/artifact/patchprocess/patch-asflicense-problems.txt modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs hadoop-yarn-project/hadoop-yarn hadoop-tools/hadoop-streaming hadoop-tools/hadoop-distcp hadoop-tools/hadoop-archives hadoop-tools/hadoop-archive-logs hadoop-tools/hadoop-rumen hadoop-tools/hadoop-gridmix hadoop-tools/hadoop-datajoin hadoop-tools/hadoop-extras hadoop-tools/hadoop-openstack hadoop-tools/hadoop-aws hadoop-tools/hadoop-azure hadoop-tools/hadoop-sls hadoop-tools/hadoop-kafka hadoop-dist hadoop-mapreduce-project U: . Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8770/console Powered by Apache Yetus 0.3.0-SNAPSHOT http://yetus.apache.org This message was automatically generated.
          aw Allen Wittenauer added a comment -

          728m

          lol so the results are completely inaccurate.

          aw Allen Wittenauer added a comment -

          OK, (manually) broken up into 4 patches that should be Yetus friendly. Also fixed the above error and all known whitespace and shellcheck errors. I'll upload a combined version as -01 here.

          aw Allen Wittenauer added a comment -

          -01:

          • combination of '80-'83 subtasks
          • fixes a few shellcheck errors (including reworking document_optionals)
          • fixes a few whitespace errors
          • full path now used for optional components
          • the hadoop-openstack dependency plugin wasn't being configured
          • set the mode on hadoop-layout.sh.example

          (I won't be submitting this to Jenkins.)
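          For context, the optional components these patches wire up are toggled from hadoop-env.sh via HADOOP_OPTIONAL_TOOLS (per the release note), without touching the user-facing CLASSPATH variables. A hypothetical fragment — the module names are examples only:

          ```shell
          # hadoop-env.sh fragment (sketch): turn on optional tools without
          # touching the user-facing CLASSPATH variables.
          # The module names below are examples only.
          export HADOOP_OPTIONAL_TOOLS="hadoop-aws,hadoop-azure"
          ```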

          aw Allen Wittenauer added a comment - edited

          -02:

          • documentation
          • eliminate HADOOP_TOOLS_PATH since it makes zero sense anymore with this layout and the other capabilities of the shell code in trunk
          • rework to hopefully work with Windows.

          Should I break this apart to send through Jenkins or ... ?
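          The replacement variables named in the release note (HADOOP_TOOLS_HOME, HADOOP_TOOLS_DIR, HADOOP_TOOLS_LIB_JARS_DIR) compose roughly like this — a sketch only, with a made-up install root; the real defaults live in hadoop-functions.sh and may differ:

          ```shell
          # Sketch of how the replacement variables compose; /opt/hadoop is a
          # hypothetical install root, and the actual defaults in
          # hadoop-functions.sh may differ.
          unset HADOOP_TOOLS_HOME HADOOP_TOOLS_DIR HADOOP_TOOLS_LIB_JARS_DIR
          HADOOP_HOME=/opt/hadoop

          HADOOP_TOOLS_HOME=${HADOOP_TOOLS_HOME:-${HADOOP_HOME}}
          HADOOP_TOOLS_DIR=${HADOOP_TOOLS_DIR:-share/hadoop/tools}
          HADOOP_TOOLS_LIB_JARS_DIR=${HADOOP_TOOLS_LIB_JARS_DIR:-${HADOOP_TOOLS_DIR}/lib}

          # a tool shellprofile would resolve its jars relative to these
          tools_jar_dir="${HADOOP_TOOLS_HOME}/${HADOOP_TOOLS_LIB_JARS_DIR}"
          echo "${tools_jar_dir}"   # prints: /opt/hadoop/share/hadoop/tools/lib
          ```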

          aw Allen Wittenauer added a comment -

          Manual run without unit tests:

          Vote Subsystem Runtime Comment
          ============================================================================
          0 reexec 0m 21s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author
                tags.
          +1 test4tests 0m 0s The patch appears to include 6 new or
                modified test files.
          0 mvndep 3m 48s Maven dependency ordering for branch
          +1 mvninstall 11m 42s trunk passed
          +1 compile 13m 16s trunk passed
          +1 mvnsite 12m 55s trunk passed
          +1 mvneclipse 2m 37s trunk passed
          +1 javadoc 10m 23s trunk passed
          0 mvndep 0m 20s Maven dependency ordering for patch
          +1 mvninstall 23m 27s the patch passed
          +1 compile 11m 19s the patch passed
          +1 javac 11m 19s the patch passed
          +1 mvnsite 13m 4s the patch passed
          +1 mvneclipse 0m 57s the patch passed
          +1 shellcheck 0m 7s The applied patch generated 0 new + 94
                unchanged - 5 fixed = 94 total (was 99)
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 xml 0m 5s The patch has no ill-formed XML file.
          +1 javadoc 10m 53s the patch passed
          +1 asflicense 0m 24s Patch does not generate ASF License
                warnings.
              116m 18s

          Manually running unit tests:

           hadoop-common-project/hadoop-common$ mvn test -DskipTests -Pshelltest
          
          [INFO] --- maven-antrun-plugin:1.7:run (common-test-bats-driver) @ hadoop-common ---
          [INFO] Executing tasks
          
          main:
               [exec] Running bats -t hadoop_add_classpath.bats
               [exec] 1..11
               [exec] ok 1 hadoop_add_classpath (simple not exist)
               [exec] ok 2 hadoop_add_classpath (simple wildcard not exist)
               [exec] ok 3 hadoop_add_classpath (simple exist)
               [exec] ok 4 hadoop_add_classpath (simple wildcard exist)
               [exec] ok 5 hadoop_add_classpath (simple dupecheck)
               [exec] ok 6 hadoop_add_classpath (default order)
               [exec] ok 7 hadoop_add_classpath (after order)
               [exec] ok 8 hadoop_add_classpath (before order)
               [exec] ok 9 hadoop_add_classpath (simple dupecheck 2)
               [exec] ok 10 hadoop_add_classpath (dupecheck 3)
               [exec] ok 11 hadoop_add_classpath (complex ordering)
               [exec] Running bats -t hadoop_add_colonpath.bats
               [exec] 1..9
               [exec] ok 1 hadoop_add_colonpath (simple not exist)
               [exec] ok 2 hadoop_add_colonpath (simple exist)
               [exec] ok 3 hadoop_add_colonpath (simple dupecheck)
               [exec] ok 4 hadoop_add_colonpath (default order)
               [exec] ok 5 hadoop_add_colonpath (after order)
               [exec] ok 6 hadoop_add_colonpath (before order)
               [exec] ok 7 hadoop_add_colonpath (simple dupecheck 2)
               [exec] ok 8 hadoop_add_colonpath (dupecheck 3)
               [exec] ok 9 hadoop_add_colonpath (complex ordering)
               [exec] Running bats -t hadoop_add_common_to_classpath.bats
               [exec] 1..3
               [exec] ok 1 hadoop_add_common_to_classpath (negative)
               [exec] ok 2 hadoop_add_common_to_classpath (positive)
               [exec] ok 3 hadoop_add_common_to_classpath (build paths)
               [exec] Running bats -t hadoop_add_javalibpath.bats
               [exec] 1..9
               [exec] ok 1 hadoop_add_javalibpath (simple not exist)
               [exec] ok 2 hadoop_add_javalibpath (simple exist)
               [exec] ok 3 hadoop_add_javalibpath (simple dupecheck)
               [exec] ok 4 hadoop_add_javalibpath (default order)
               [exec] ok 5 hadoop_add_javalibpath (after order)
               [exec] ok 6 hadoop_add_javalibpath (before order)
               [exec] ok 7 hadoop_add_javalibpath (simple dupecheck 2)
               [exec] ok 8 hadoop_add_javalibpath (dupecheck 3)
               [exec] ok 9 hadoop_add_javalibpath (complex ordering)
               [exec] Running bats -t hadoop_add_ldlibpath.bats
               [exec] 1..9
               [exec] ok 1 hadoop_add_ldlibpath (simple not exist)
               [exec] ok 2 hadoop_add_ldlibpath (simple exist)
               [exec] ok 3 hadoop_add_ldlibpath (simple dupecheck)
               [exec] ok 4 hadoop_add_ldlibpath (default order)
               [exec] ok 5 hadoop_add_ldlibpath (after order)
               [exec] ok 6 hadoop_add_ldlibpath (before order)
               [exec] ok 7 hadoop_add_ldlibpath (simple dupecheck 2)
               [exec] ok 8 hadoop_add_ldlibpath (dupecheck 3)
               [exec] ok 9 hadoop_add_ldlibpath (complex ordering)
               [exec] Running bats -t hadoop_add_param.bats
               [exec] 1..4
               [exec] ok 1 hadoop_add_param (positive 1)
               [exec] ok 2 hadoop_add_param (negative)
               [exec] ok 3 hadoop_add_param (positive 2)
               [exec] ok 4 hadoop_add_param (positive 3)
               [exec] Running bats -t hadoop_add_to_classpath_tools.bats
               [exec] 1..3
               [exec] ok 1 hadoop_classpath_tools (load)
               [exec] ok 2 hadoop_classpath_tools (not exist)
               [exec] ok 3 hadoop_classpath_tools (function)
               [exec] Running bats -t hadoop_add_to_classpath_userpath.bats
               [exec] 1..7
               [exec] ok 1 hadoop_add_to_classpath_userpath (nothing)
               [exec] ok 2 hadoop_add_to_classpath_userpath (none)
               [exec] ok 3 hadoop_add_to_classpath_userpath (only)
               [exec] ok 4 hadoop_add_to_classpath_userpath (classloader)
               [exec] ok 5 hadoop_add_to_classpath_userpath (1+1 dupe)
               [exec] ok 6 hadoop_add_to_classpath_userpath (3+2 after)
               [exec] ok 7 hadoop_add_to_classpath_userpath (3+2 before)
               [exec] Running bats -t hadoop_basic_init.bats
               [exec] 1..3
               [exec] ok 1 hadoop_basic_init (bad dir errors)
               [exec] ok 2 hadoop_basic_init (no non-dir overrides)
               [exec] ok 3 hadoop_basic_init (test non-dir overrides)
               [exec] Running bats -t hadoop_bootstrap.bats
               [exec] 1..2
               [exec] ok 1 hadoop_bootstrap (no libexec)
               [exec] ok 2 hadoop_bootstrap (libexec)
               [exec] Running bats -t hadoop_confdir.bats
               [exec] 1..9
               [exec] ok 1 hadoop_find_confdir (default)
               [exec] ok 2 hadoop_find_confdir (bw compat: conf)
               [exec] ok 3 hadoop_find_confdir (etc/hadoop)
               [exec] ok 4 hadoop_verify_confdir (negative) 
               [exec] ok 5 hadoop_verify_confdir (positive) 
               [exec] ok 6 hadoop_exec_hadoopenv (positive) 
               [exec] ok 7 hadoop_exec_hadoopenv (negative) 
               [exec] ok 8 hadoop_exec_userfuncs
               [exec] ok 9 hadoop_exec_hadooprc
               [exec] Running bats -t hadoop_deprecate_envvar.bats
               [exec] 1..2
               [exec] ok 1 hadoop_deprecate_envvar (replace)
               [exec] ok 2 hadoop_deprecate_envvar (no replace)
               [exec] Running bats -t hadoop_entry_tests.bats
               [exec] 1..4
               [exec] ok 1 hadoop_add_entry (positive 1)
               [exec] ok 2 hadoop_add_entry (negative)
               [exec] ok 3 hadoop_add_entry (positive 2)
               [exec] ok 4 hadoop_add_entry (positive 3)
               [exec] Running bats -t hadoop_finalize.bats
               [exec] 1..11
               [exec] ok 1 hadoop_finalize (shellprofiles)
               [exec] ok 2 hadoop_finalize (classpath)
               [exec] ok 3 hadoop_finalize (libpaths)
               [exec] ok 4 hadoop_finalize (heap)
               [exec] ok 5 hadoop_finalize (opts)
               [exec] ok 6 hadoop_finalize (cygwin prefix)
               [exec] ok 7 hadoop_finalize (cygwin conf dir)
               [exec] ok 8 hadoop_finalize (cygwin common)
               [exec] ok 9 hadoop_finalize (cygwin hdfs)
               [exec] ok 10 hadoop_finalize (cygwin yarn)
               [exec] ok 11 hadoop_finalize (cygwin mapred)
               [exec] Running bats -t hadoop_finalize_catalina_opts.bats
               [exec] 1..2
               [exec] ok 1 hadoop_finalize_catalina_opts (raw)
               [exec] ok 2 # skip (catalina commands not supported under cygwin yet) hadoop_finalize_catalina_opts (cygwin)
               [exec] Running bats -t hadoop_finalize_classpath.bats
               [exec] 1..4
               [exec] ok 1 hadoop_finalize_classpath (only conf dir)
               [exec] ok 2 hadoop_finalize_classpath (before conf dir)
               [exec] ok 3 hadoop_finalize_classpath (adds user)
               [exec] ok 4 hadoop_finalize_classpath (calls cygwin)
               [exec] Running bats -t hadoop_finalize_hadoop_heap.bats
               [exec] 1..8
               [exec] ok 1 hadoop_finalize_hadoop_heap (negative)
               [exec] ok 2 hadoop_finalize_hadoop_heap (no unit max)
               [exec] ok 3 hadoop_finalize_hadoop_heap (no unit old)
               [exec] ok 4 hadoop_finalize_hadoop_heap (unit max)
               [exec] ok 5 hadoop_finalize_hadoop_heap (unit old)
               [exec] ok 6 hadoop_finalize_hadoop_heap (no unit min)
               [exec] ok 7 hadoop_finalize_hadoop_heap (unit min)
               [exec] ok 8 hadoop_finalize_hadoop_heap (dedupe)
               [exec] Running bats -t hadoop_finalize_hadoop_opts.bats
               [exec] 1..2
               [exec] ok 1 hadoop_finalize_hadoop_opts (raw)
               [exec] ok 2 hadoop_finalize_hadoop_opts (cygwin)
               [exec] Running bats -t hadoop_finalize_libpaths.bats
               [exec] 1..2
               [exec] ok 1 hadoop_finalize_libpaths (negative)
               [exec] ok 2 hadoop_finalize_libpaths (positive)
               [exec] Running bats -t hadoop_java_setup.bats
               [exec] 1..4
               [exec] ok 1 hadoop_java_setup (negative not set)
               [exec] ok 2 hadoop_java_setup (negative not a dir)
               [exec] ok 3 hadoop_java_setup (negative not exec)
               [exec] ok 4 hadoop_java_setup (positive)
               [exec] Running bats -t hadoop_os_tricks.bats
               [exec] 1..3
               [exec] ok 1 hadoop_os_tricks (cygwin sets cygwin)
               [exec] ok 2 hadoop_os_tricks (linux sets arena max)
               [exec] ok 3 hadoop_os_tricks (osx sets java_home)
               [exec] Running bats -t hadoop_rotate_log.bats
               [exec] 1..4
               [exec] ok 1 hadoop_rotate_log (defaults)
               [exec] ok 2 hadoop_rotate_log (one archive log)
               [exec] ok 3 hadoop_rotate_log (default five archive logs)
               [exec] ok 4 hadoop_rotate_log (ten archive logs)
               [exec] Running bats -t hadoop_shellprofile.bats
               [exec] 1..9
               [exec] ok 1 hadoop_import_shellprofiles (negative)
               [exec] ok 2 hadoop_import_shellprofiles (libexec sh import)
               [exec] ok 3 hadoop_import_shellprofiles (libexec conf sh import+override)
               [exec] ok 4 hadoop_import_shellprofiles (libexec no cmd import)
               [exec] ok 5 hadoop_import_shellprofiles (H_O_T)
               [exec] ok 6 hadoop_add_profile+hadoop_shellprofiles_init
               [exec] ok 7 hadoop_add_profile+hadoop_shellprofiles_classpath
               [exec] ok 8 hadoop_add_profile+hadoop_shellprofiles_nativelib
               [exec] ok 9 hadoop_add_profile+hadoop_shellprofiles_finalize
               [exec] Running bats -t hadoop_slaves.bats
               [exec] 1..3
               [exec] ok 1 hadoop_populate_slaves_file (specific file)
               [exec] ok 2 hadoop_populate_slaves_file (specific conf dir file)
               [exec] ok 3 hadoop_populate_slaves_file (no file)
               [exec] Running bats -t hadoop_ssh.bats
               [exec] 1..7
               [exec] ok 1 # skip (Not implemented) hadoop_actual_ssh
               [exec] ok 2 # skip (Not implemented) hadoop_connect_to_hosts
               [exec] ok 3 # skip (Not implemented) hadoop_connect_to_hosts_without_pdsh
               [exec] ok 4 hadoop_common_slave_mode_execute (--slaves 1)
               [exec] ok 5 hadoop_common_slave_mode_execute (--slaves 2)
               [exec] ok 6 hadoop_common_slave_mode_execute (--hosts)
               [exec] ok 7 hadoop_common_slave_mode_execute (--hostnames 2)
               [exec] Running bats -t hadoop_stop_daemon.bats
               [exec] 1..1
               [exec] ok 1 hadoop_stop_daemon
               [exec] Running bats -t hadoop_stop_secure_daemon.bats
               [exec] 1..1
               [exec] ok 1 hadoop_stop_secure_daemon
               [exec] Running bats -t hadoop_translate_cygwin_path.bats
               [exec] 1..3
               [exec] ok 1 hadoop_translate_cygwin_path (negative)
               [exec] ok 2 hadoop_translate_cygwin_path (positive)
               [exec] ok 3 hadoop_translate_cygwin_path (path positive)
               [exec] Running bats -t hadoop_validate_classname.bats
               [exec] 1..2
               [exec] ok 1 hadoop_validate_classname (negative)
               [exec] ok 2 hadoop_validate_classname (positive)
          [INFO] Executed tasks
          [INFO] ------------------------------------------------------------------------
          [INFO] BUILD SUCCESS
          [INFO] ------------------------------------------------------------------------
          [INFO] Total time: 51.188s
          [INFO] Finished at: Thu Mar 10 11:42:32 PST 2016
          [INFO] Final Memory: 26M/84M
          [INFO] ------------------------------------------------------------------------
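          The bats cases above all exercise the same arrange-then-assert shape. As a plain-bash approximation of the "simple dupecheck" behavior being tested — the function body here is a stand-in, not the real hadoop_add_classpath from hadoop-functions.sh (which also checks that the path exists, handles wildcards, and supports before/after ordering):

          ```shell
          # Plain-bash approximation of the "simple dupecheck" case above.
          # This is a stand-in, not the real hadoop-functions.sh implementation.
          hadoop_add_classpath() {
            local path=$1
            case ":${CLASSPATH}:" in
              *":${path}:"*) return 1 ;;   # already present: reject duplicate
              *) CLASSPATH="${CLASSPATH:+${CLASSPATH}:}${path}" ;;
            esac
          }

          CLASSPATH=""
          hadoop_add_classpath /tmp/a.jar
          hadoop_add_classpath /tmp/b.jar
          hadoop_add_classpath /tmp/a.jar || echo "dupe rejected"
          echo "${CLASSPATH}"   # prints: /tmp/a.jar:/tmp/b.jar
          ```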
          
          Show
          aw Allen Wittenauer added a comment - Manual run without unit tests: Vote Subsystem Runtime Comment ============================================================================ 0 reexec 0m 21s Docker mode activated. 0 shelldocs 0m 4s Shelldocs was not available. +1 @author 0m 0s The patch does not contain any @author       tags. +1 test4tests 0m 0s The patch appears to include 6 new or       modified test files. 0 mvndep 3m 48s Maven dependency ordering for branch +1 mvninstall 11m 42s trunk passed +1 compile 13m 16s trunk passed +1 mvnsite 12m 55s trunk passed +1 mvneclipse 2m 37s trunk passed +1 javadoc 10m 23s trunk passed 0 mvndep 0m 20s Maven dependency ordering for patch +1 mvninstall 23m 27s the patch passed +1 compile 11m 19s the patch passed +1 javac 11m 19s the patch passed +1 mvnsite 13m 4s the patch passed +1 mvneclipse 0m 57s the patch passed +1 shellcheck 0m 7s The applied patch generated 0 new + 94       unchanged - 5 fixed = 94 total (was 99) +1 whitespace 0m 0s Patch has no whitespace issues. +1 xml 0m 5s The patch has no ill-formed XML file. +1 javadoc 10m 53s the patch passed +1 asflicense 0m 24s Patch does not generate ASF License       warnings.     
116m 18s Manually running unit tests: hadoop-common-project/hadoop-common$ mvn test -DskipTests -Pshelltest [INFO] --- maven-antrun-plugin:1.7:run (common-test-bats-driver) @ hadoop-common --- [INFO] Executing tasks main: [exec] Running bats -t hadoop_add_classpath.bats [exec] 1..11 [exec] ok 1 hadoop_add_classpath (simple not exist) [exec] ok 2 hadoop_add_classpath (simple wildcard not exist) [exec] ok 3 hadoop_add_classpath (simple exist) [exec] ok 4 hadoop_add_classpath (simple wildcard exist) [exec] ok 5 hadoop_add_classpath (simple dupecheck) [exec] ok 6 hadoop_add_classpath ( default order) [exec] ok 7 hadoop_add_classpath (after order) [exec] ok 8 hadoop_add_classpath (before order) [exec] ok 9 hadoop_add_classpath (simple dupecheck 2) [exec] ok 10 hadoop_add_classpath (dupecheck 3) [exec] ok 11 hadoop_add_classpath (complex ordering) [exec] Running bats -t hadoop_add_colonpath.bats [exec] 1..9 [exec] ok 1 hadoop_add_colonpath (simple not exist) [exec] ok 2 hadoop_add_colonpath (simple exist) [exec] ok 3 hadoop_add_colonpath (simple dupecheck) [exec] ok 4 hadoop_add_colonpath ( default order) [exec] ok 5 hadoop_add_colonpath (after order) [exec] ok 6 hadoop_add_colonpath (before order) [exec] ok 7 hadoop_add_colonpath (simple dupecheck 2) [exec] ok 8 hadoop_add_colonpath (dupecheck 3) [exec] ok 9 hadoop_add_colonpath (complex ordering) [exec] Running bats -t hadoop_add_common_to_classpath.bats [exec] 1..3 [exec] ok 1 hadoop_add_common_to_classpath (negative) [exec] ok 2 hadoop_add_common_to_classpath (positive) [exec] ok 3 hadoop_add_common_to_classpath (build paths) [exec] Running bats -t hadoop_add_javalibpath.bats [exec] 1..9 [exec] ok 1 hadoop_add_javalibpath (simple not exist) [exec] ok 2 hadoop_add_javalibpath (simple exist) [exec] ok 3 hadoop_add_javalibpath (simple dupecheck) [exec] ok 4 hadoop_add_javalibpath ( default order) [exec] ok 5 hadoop_add_javalibpath (after order) [exec] ok 6 hadoop_add_javalibpath (before order) [exec] ok 7 
hadoop_add_javalibpath (simple dupecheck 2) [exec] ok 8 hadoop_add_javalibpath (dupecheck 3) [exec] ok 9 hadoop_add_javalibpath (complex ordering) [exec] Running bats -t hadoop_add_ldlibpath.bats [exec] 1..9 [exec] ok 1 hadoop_add_ldlibpath (simple not exist) [exec] ok 2 hadoop_add_ldlibpath (simple exist) [exec] ok 3 hadoop_add_ldlibpath (simple dupecheck) [exec] ok 4 hadoop_add_ldlibpath ( default order) [exec] ok 5 hadoop_add_ldlibpath (after order) [exec] ok 6 hadoop_add_ldlibpath (before order) [exec] ok 7 hadoop_add_ldlibpath (simple dupecheck 2) [exec] ok 8 hadoop_add_ldlibpath (dupecheck 3) [exec] ok 9 hadoop_add_ldlibpath (complex ordering) [exec] Running bats -t hadoop_add_param.bats [exec] 1..4 [exec] ok 1 hadoop_add_param (positive 1) [exec] ok 2 hadoop_add_param (negative) [exec] ok 3 hadoop_add_param (positive 2) [exec] ok 4 hadoop_add_param (positive 3) [exec] Running bats -t hadoop_add_to_classpath_tools.bats [exec] 1..3 [exec] ok 1 hadoop_classpath_tools (load) [exec] ok 2 hadoop_classpath_tools (not exist) [exec] ok 3 hadoop_classpath_tools (function) [exec] Running bats -t hadoop_add_to_classpath_userpath.bats [exec] 1..7 [exec] ok 1 hadoop_add_to_classpath_userpath (nothing) [exec] ok 2 hadoop_add_to_classpath_userpath (none) [exec] ok 3 hadoop_add_to_classpath_userpath (only) [exec] ok 4 hadoop_add_to_classpath_userpath (classloader) [exec] ok 5 hadoop_add_to_classpath_userpath (1+1 dupe) [exec] ok 6 hadoop_add_to_classpath_userpath (3+2 after) [exec] ok 7 hadoop_add_to_classpath_userpath (3+2 before) [exec] Running bats -t hadoop_basic_init.bats [exec] 1..3 [exec] ok 1 hadoop_basic_init (bad dir errors) [exec] ok 2 hadoop_basic_init (no non-dir overrides) [exec] ok 3 hadoop_basic_init (test non-dir overrides) [exec] Running bats -t hadoop_bootstrap.bats [exec] 1..2 [exec] ok 1 hadoop_bootstrap (no libexec) [exec] ok 2 hadoop_bootstrap (libexec) [exec] Running bats -t hadoop_confdir.bats [exec] 1..9 [exec] ok 1 hadoop_find_confdir ( default ) 
[exec] ok 2 hadoop_find_confdir (bw compat: conf) [exec] ok 3 hadoop_find_confdir (etc/hadoop) [exec] ok 4 hadoop_verify_confdir (negative) [exec] ok 5 hadoop_verify_confdir (positive) [exec] ok 6 hadoop_exec_hadoopenv (positive) [exec] ok 7 hadoop_exec_hadoopenv (negative) [exec] ok 8 hadoop_exec_userfuncs [exec] ok 9 hadoop_exec_hadooprc [exec] Running bats -t hadoop_deprecate_envvar.bats [exec] 1..2 [exec] ok 1 hadoop_deprecate_envvar (replace) [exec] ok 2 hadoop_deprecate_envvar (no replace) [exec] Running bats -t hadoop_entry_tests.bats [exec] 1..4 [exec] ok 1 hadoop_add_entry (positive 1) [exec] ok 2 hadoop_add_entry (negative) [exec] ok 3 hadoop_add_entry (positive 2) [exec] ok 4 hadoop_add_entry (positive 3) [exec] Running bats -t hadoop_finalize.bats [exec] 1..11 [exec] ok 1 hadoop_finalize (shellprofiles) [exec] ok 2 hadoop_finalize (classpath) [exec] ok 3 hadoop_finalize (libpaths) [exec] ok 4 hadoop_finalize (heap) [exec] ok 5 hadoop_finalize (opts) [exec] ok 6 hadoop_finalize (cygwin prefix) [exec] ok 7 hadoop_finalize (cygwin conf dir) [exec] ok 8 hadoop_finalize (cygwin common) [exec] ok 9 hadoop_finalize (cygwin hdfs) [exec] ok 10 hadoop_finalize (cygwin yarn) [exec] ok 11 hadoop_finalize (cygwin mapred) [exec] Running bats -t hadoop_finalize_catalina_opts.bats [exec] 1..2 [exec] ok 1 hadoop_finalize_catalina_opts (raw) [exec] ok 2 # skip (catalina commands not supported under cygwin yet) hadoop_finalize_catalina_opts (cygwin) [exec] Running bats -t hadoop_finalize_classpath.bats [exec] 1..4 [exec] ok 1 hadoop_finalize_classpath (only conf dir) [exec] ok 2 hadoop_finalize_classpath (before conf dir) [exec] ok 3 hadoop_finalize_classpath (adds user) [exec] ok 4 hadoop_finalize_classpath (calls cygwin) [exec] Running bats -t hadoop_finalize_hadoop_heap.bats [exec] 1..8 [exec] ok 1 hadoop_finalize_hadoop_heap (negative) [exec] ok 2 hadoop_finalize_hadoop_heap (no unit max) [exec] ok 3 hadoop_finalize_hadoop_heap (no unit old) [exec] ok 4 
hadoop_finalize_hadoop_heap (unit max)
[exec] ok 5 hadoop_finalize_hadoop_heap (unit old)
[exec] ok 6 hadoop_finalize_hadoop_heap (no unit min)
[exec] ok 7 hadoop_finalize_hadoop_heap (unit min)
[exec] ok 8 hadoop_finalize_hadoop_heap (dedupe)
[exec] Running bats -t hadoop_finalize_hadoop_opts.bats
[exec] 1..2
[exec] ok 1 hadoop_finalize_hadoop_opts (raw)
[exec] ok 2 hadoop_finalize_hadoop_opts (cygwin)
[exec] Running bats -t hadoop_finalize_libpaths.bats
[exec] 1..2
[exec] ok 1 hadoop_finalize_libpaths (negative)
[exec] ok 2 hadoop_finalize_libpaths (positive)
[exec] Running bats -t hadoop_java_setup.bats
[exec] 1..4
[exec] ok 1 hadoop_java_setup (negative not set)
[exec] ok 2 hadoop_java_setup (negative not a dir)
[exec] ok 3 hadoop_java_setup (negative not exec)
[exec] ok 4 hadoop_java_setup (positive)
[exec] Running bats -t hadoop_os_tricks.bats
[exec] 1..3
[exec] ok 1 hadoop_os_tricks (cygwin sets cygwin)
[exec] ok 2 hadoop_os_tricks (linux sets arena max)
[exec] ok 3 hadoop_os_tricks (osx sets java_home)
[exec] Running bats -t hadoop_rotate_log.bats
[exec] 1..4
[exec] ok 1 hadoop_rotate_log (defaults)
[exec] ok 2 hadoop_rotate_log (one archive log)
[exec] ok 3 hadoop_rotate_log (default five archive logs)
[exec] ok 4 hadoop_rotate_log (ten archive logs)
[exec] Running bats -t hadoop_shellprofile.bats
[exec] 1..9
[exec] ok 1 hadoop_import_shellprofiles (negative)
[exec] ok 2 hadoop_import_shellprofiles (libexec sh import)
[exec] ok 3 hadoop_import_shellprofiles (libexec conf sh import +override)
[exec] ok 4 hadoop_import_shellprofiles (libexec no cmd import)
[exec] ok 5 hadoop_import_shellprofiles (H_O_T)
[exec] ok 6 hadoop_add_profile+hadoop_shellprofiles_init
[exec] ok 7 hadoop_add_profile+hadoop_shellprofiles_classpath
[exec] ok 8 hadoop_add_profile+hadoop_shellprofiles_nativelib
[exec] ok 9 hadoop_add_profile+hadoop_shellprofiles_finalize
[exec] Running bats -t hadoop_slaves.bats
[exec] 1..3
[exec] ok 1 hadoop_populate_slaves_file (specific file)
[exec] ok 2 hadoop_populate_slaves_file (specific conf dir file)
[exec] ok 3 hadoop_populate_slaves_file (no file)
[exec] Running bats -t hadoop_ssh.bats
[exec] 1..7
[exec] ok 1 # skip (Not implemented) hadoop_actual_ssh
[exec] ok 2 # skip (Not implemented) hadoop_connect_to_hosts
[exec] ok 3 # skip (Not implemented) hadoop_connect_to_hosts_without_pdsh
[exec] ok 4 hadoop_common_slave_mode_execute (--slaves 1)
[exec] ok 5 hadoop_common_slave_mode_execute (--slaves 2)
[exec] ok 6 hadoop_common_slave_mode_execute (--hosts)
[exec] ok 7 hadoop_common_slave_mode_execute (--hostnames 2)
[exec] Running bats -t hadoop_stop_daemon.bats
[exec] 1..1
[exec] ok 1 hadoop_stop_daemon
[exec] Running bats -t hadoop_stop_secure_daemon.bats
[exec] 1..1
[exec] ok 1 hadoop_stop_secure_daemon
[exec] Running bats -t hadoop_translate_cygwin_path.bats
[exec] 1..3
[exec] ok 1 hadoop_translate_cygwin_path (negative)
[exec] ok 2 hadoop_translate_cygwin_path (positive)
[exec] ok 3 hadoop_translate_cygwin_path (path positive)
[exec] Running bats -t hadoop_validate_classname.bats
[exec] 1..2
[exec] ok 1 hadoop_validate_classname (negative)
[exec] ok 2 hadoop_validate_classname (positive)
[INFO] Executed tasks
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 51.188s
[INFO] Finished at: Thu Mar 10 11:42:32 PST 2016
[INFO] Final Memory: 26M/84M
[INFO] ------------------------------------------------------------------------
          aw Allen Wittenauer added a comment -

          I wish there were a way to dynamically add subcommands to hadoop, mapred, etc., but the code just isn't quite there yet. We can do usage now, but not actual execution.

          I know how to do this now in a very clean way that would even allow 3rd parties to add subcommands to the shell commands. It is definitely complementary to this patch, but I'll wait for this one to get committed before taking that on, since it's a much bigger patch.
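For context, the dynamic-subcommand idea described above can be sketched in bash with function-name dispatch: a third-party shell profile defines one function per subcommand, and the wrapper script probes for a matching function before falling back to an error. Everything below (the hadoop_subcommand_* naming convention, demo_dispatch, the greet subcommand) is a hypothetical illustration of the pattern, not the API this issue or its follow-up actually added.

```shell
#!/usr/bin/env bash
# Sketch of function-name dispatch for pluggable CLI subcommands.
# Names here are illustrative only, not the real Hadoop shell API.

# A plugin's shellprofile could define one function per subcommand:
hadoop_subcommand_greet() {
  echo "greetings from a plugin subcommand"
}

# The wrapper dispatches by checking whether a function with the
# conventional name exists, then calling it with the remaining args.
demo_dispatch() {
  local subcmd=$1
  shift
  if declare -F "hadoop_subcommand_${subcmd}" > /dev/null; then
    "hadoop_subcommand_${subcmd}" "$@"
  else
    echo "ERROR: unknown subcommand: ${subcmd}" 1>&2
    return 1
  fi
}

demo_dispatch greet   # prints: greetings from a plugin subcommand
```

Because dispatch is driven purely by function names, a plugin dropped into the shellprofile directory can add subcommands without any edits to the wrapper script itself.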

          raviprak Ravi Prakash added a comment -

          +1. LGTM. Please feel free to commit.

          aw Allen Wittenauer added a comment -

          Thanks for the review!

          Committed to trunk.

          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-trunk-Commit #9492 (See https://builds.apache.org/job/Hadoop-trunk-Commit/9492/)
          HADOOP-12857. rework hadoop-tools (aw) (aw: rev 738155063e6fa3f1811e2e875e2e9611f35ef423)

          • hadoop-tools/hadoop-sls/src/main/bin/slsrun.sh
          • hadoop-tools/hadoop-streaming/pom.xml
          • hadoop-tools/hadoop-archive-logs/pom.xml
          • hadoop-tools/hadoop-datajoin/pom.xml
          • hadoop-common-project/hadoop-common/src/main/conf/hadoop-env.sh
          • hadoop-tools/hadoop-aws/pom.xml
          • hadoop-tools/hadoop-rumen/pom.xml
          • hadoop-tools/hadoop-sls/src/main/bin/rumen2sls.sh
          • hadoop-tools/hadoop-azure/src/site/markdown/index.md
          • hadoop-tools/hadoop-openstack/src/site/markdown/index.md
          • hadoop-tools/hadoop-sls/pom.xml
          • hadoop-tools/hadoop-azure/pom.xml
          • hadoop-tools/hadoop-aws/src/site/markdown/tools/hadoop-aws/index.md
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_add_to_classpath_tools.bats
          • dev-support/bin/dist-tools-hooks-maker
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop-layout.sh.example
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_shellprofile.bats
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_entry_tests.bats
          • hadoop-tools/hadoop-gridmix/pom.xml
          • hadoop-tools/hadoop-openstack/pom.xml
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_bootstrap.bats
          • hadoop-hdfs-project/hadoop-hdfs/src/main/bin/hdfs
          • hadoop-tools/hadoop-extras/pom.xml
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop-functions.sh
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_add_to_classpath_toolspath.bats
          • hadoop-tools/hadoop-archives/pom.xml
          • hadoop-tools/hadoop-kafka/pom.xml
          • hadoop-yarn-project/hadoop-yarn/bin/yarn
          • hadoop-tools/hadoop-distcp/pom.xml
          • hadoop-mapreduce-project/bin/mapred
          • hadoop-common-project/hadoop-common/src/test/scripts/hadoop_basic_init.bats
          • hadoop-dist/pom.xml

            People

            • Assignee:
              aw Allen Wittenauer
            • Reporter:
              aw Allen Wittenauer
            • Votes: 0
            • Watchers: 11