HADOOP-12563 (Hadoop Common)

Updated utility to create/modify token files

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0-alpha1
    • Fix Version/s: 3.0.0-alpha1
    • Component/s: None
    • Labels: None
    • Target Version/s:
    • Hadoop Flags: Incompatible change, Reviewed
    • Release Note:
      This feature introduces a new command called "hadoop dtutil" which lets users request and download delegation tokens with certain attributes.

    Description

      hdfs fetchdt is missing some critical features and is geared almost exclusively towards HDFS operations. Additionally, the token files it creates use Java serialization, which is hard or impossible to consume from other languages. It should be replaced with a better utility in common that can read/write protobuf-based token files, is flexible enough to be used with other services, and offers key functionality such as append and rename. The old file format should still be supported for backward compatibility, but will be effectively deprecated.

      A follow-on JIRA will deprecate fetchdt.
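
      As a rough illustration of the intended round trip, here is a minimal
      sketch. The read call below is the long-standing Credentials API; the
      format-selection behavior described in the comments is what this JIRA
      proposes, so treat those details as illustrative, not final:

          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.fs.Path;
          import org.apache.hadoop.security.Credentials;

          public class TokenFileRoundTrip {
            public static void main(String[] args) throws Exception {
              Configuration conf = new Configuration();
              // Read a token file. The loader inspects the on-disk header, so
              // both the legacy Java-serialized format and a newer
              // protobuf-based format can be accepted transparently.
              Credentials creds =
                  Credentials.readTokenStorageFile(new Path(args[0]), conf);
              // Write the credentials back out; under this proposal a format
              // option would choose between the legacy and protobuf encodings.
              creds.writeTokenStorageFile(new Path(args[1]), conf);
            }
          }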

    Attachments

      1. dtutil-test-out
        18 kB
        Matthew Paduano
      2. example_dtutil_commands_and_output.txt
        17 kB
        Matthew Paduano
      3. generalized_token_case.pdf
        65 kB
        Matthew Paduano
      4. HADOOP-12563.01.patch
        52 kB
        Matthew Paduano
      5. HADOOP-12563.02.patch
        52 kB
        Matthew Paduano
      6. HADOOP-12563.03.patch
        53 kB
        Matthew Paduano
      7. HADOOP-12563.04.patch
        58 kB
        Matthew Paduano
      8. HADOOP-12563.05.patch
        58 kB
        Matthew Paduano
      9. HADOOP-12563.06.patch
        59 kB
        Matthew Paduano
      10. HADOOP-12563.07.patch
        68 kB
        Allen Wittenauer
      11. HADOOP-12563.07.patch
        68 kB
        Matthew Paduano
      12. HADOOP-12563.08.patch
        69 kB
        Matthew Paduano
      13. HADOOP-12563.09.patch
        74 kB
        Matthew Paduano
      14. HADOOP-12563.10.patch
        81 kB
        Matthew Paduano
      15. HADOOP-12563.11.patch
        81 kB
        Matthew Paduano
      16. HADOOP-12563.12.patch
        82 kB
        Matthew Paduano
      17. HADOOP-12563.13.patch
        82 kB
        Matthew Paduano
      18. HADOOP-12563.14.patch
        82 kB
        Matthew Paduano
      19. HADOOP-12563.15.patch
        95 kB
        Matthew Paduano
      20. HADOOP-12563.16.patch
        95 kB
        Matthew Paduano

    Activity

          Allen Wittenauer added a comment - edited

          Please re-upload the patch as 02. The testing system only looks at the last file attached.

          Matthew Paduano added a comment -

          Renamed the file to 02 and re-uploaded.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          -1 docker 18m 4s Docker failed to build yetus/hadoop:0ca8df7.



          Subsystem Report/Notes
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12773290/HADOOP-12563.02.patch
          JIRA Issue HADOOP-12563
          Powered by Apache Yetus http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8100/console

          This message was automatically generated.

          Allen Wittenauer added a comment -

          Whoa. That's exciting. Let me go look at Jenkins.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 8s docker + precommit patch detected.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          +1 mvninstall 8m 31s trunk passed
          +1 compile 9m 51s trunk passed with JDK v1.8.0_66
          +1 compile 10m 5s trunk passed with JDK v1.7.0_85
          +1 checkstyle 0m 17s trunk passed
          +1 mvnsite 1m 11s trunk passed
          +1 mvneclipse 0m 15s trunk passed
          +1 findbugs 2m 13s trunk passed
          +1 javadoc 1m 11s trunk passed with JDK v1.8.0_66
          +1 javadoc 1m 17s trunk passed with JDK v1.7.0_85
          +1 mvninstall 1m 48s the patch passed
          +1 compile 11m 15s the patch passed with JDK v1.8.0_66
          +1 cc 11m 15s the patch passed
          -1 javac 17m 46s root-jdk1.8.0_66 with JDK v1.8.0_66 generated 2 new issues (was 752, now 752).
          +1 javac 11m 15s the patch passed
          +1 compile 10m 27s the patch passed with JDK v1.7.0_85
          +1 cc 10m 27s the patch passed
          -1 javac 28m 14s root-jdk1.7.0_85 with JDK v1.7.0_85 generated 1 new issues (was 745, now 746).
          +1 javac 10m 27s the patch passed
          -1 checkstyle 0m 18s Patch generated 6 new checkstyle issues in hadoop-common-project/hadoop-common (total was 27, now 6).
          +1 mvnsite 1m 11s the patch passed
          +1 mvneclipse 0m 15s the patch passed
          +1 shellcheck 0m 10s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          -1 findbugs 2m 24s hadoop-common-project/hadoop-common introduced 3 new FindBugs issues.
          +1 javadoc 1m 4s the patch passed with JDK v1.8.0_66
          +1 javadoc 1m 14s the patch passed with JDK v1.7.0_85
          +1 unit 10m 3s hadoop-common in the patch passed with JDK v1.8.0_66.
          -1 unit 9m 19s hadoop-common in the patch failed with JDK v1.7.0_85.
          +1 asflicense 0m 25s Patch does not generate ASF License warnings.
          86m 3s



          Reason Tests
          FindBugs module:hadoop-common-project/hadoop-common
            Format string should use %n rather than \n in org.apache.hadoop.security.token.DtUtilShell.getCommandUsage() At DtUtilShell.java:[line 157]
            Format string should use %n rather than \n in org.apache.hadoop.security.token.DtUtilShell$Print.execute() At DtUtilShell.java:[line 178]
            Format string should use %n rather than \n in org.apache.hadoop.security.token.DtUtilShell$Print.execute() At DtUtilShell.java:[line 182]
          JDK v1.7.0_85 Failed junit tests hadoop.metrics2.impl.TestGangliaMetrics



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:date2015-11-19
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12773290/HADOOP-12563.02.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 289158e0b5a7 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /home/jenkins/jenkins-slave/workspace/PreCommit-HADOOP-Build/patchprocess/apache-yetus-3f4279a/precommit/personality/hadoop.sh
          git revision trunk / 866dce4
          findbugs v3.0.0
          javac root-jdk1.8.0_66: https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/diff-compile-javac-root-jdk1.8.0_66.txt
          javac root-jdk1.7.0_85: https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/diff-compile-javac-root-jdk1.7.0_85.txt
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/diff-checkstyle-hadoop-common-project_hadoop-common.txt
          shellcheck v0.4.1
          findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/new-findbugs-hadoop-common-project_hadoop-common.html
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_85.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_85.txt
          JDK v1.7.0_85 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/testReport/
          modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
          Max memory used 76MB
          Powered by Apache Yetus http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8106/console

          This message was automatically generated.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 0s Docker mode activated.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          +1 mvninstall 11m 26s trunk passed
          +1 compile 15m 57s trunk passed with JDK v1.8.0_66
          +1 compile 12m 47s trunk passed with JDK v1.7.0_85
          +1 checkstyle 0m 24s trunk passed
          +1 mvnsite 1m 30s trunk passed
          +1 mvneclipse 0m 17s trunk passed
          -1 findbugs 2m 45s hadoop-common-project/hadoop-common in trunk has 1 extant Findbugs warnings.
          +1 javadoc 1m 30s trunk passed with JDK v1.8.0_66
          +1 javadoc 1m 31s trunk passed with JDK v1.7.0_85
          +1 mvninstall 2m 0s the patch passed
          +1 compile 15m 51s the patch passed with JDK v1.8.0_66
          +1 cc 15m 51s the patch passed
          +1 javac 15m 51s the patch passed
          +1 compile 13m 17s the patch passed with JDK v1.7.0_85
          +1 cc 13m 17s the patch passed
          +1 javac 13m 17s the patch passed
          -1 checkstyle 0m 26s Patch generated 6 new checkstyle issues in hadoop-common-project/hadoop-common (total was 27, now 6).
          +1 mvnsite 1m 47s the patch passed
          +1 mvneclipse 0m 21s the patch passed
          +1 shellcheck 0m 12s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 3m 19s the patch passed
          +1 javadoc 1m 53s the patch passed with JDK v1.8.0_66
          +1 javadoc 1m 46s the patch passed with JDK v1.7.0_85
          -1 unit 15m 30s hadoop-common in the patch failed with JDK v1.8.0_66.
          -1 unit 13m 58s hadoop-common in the patch failed with JDK v1.7.0_85.
          +1 asflicense 0m 38s Patch does not generate ASF License warnings.
          121m 1s



          Reason Tests
          JDK v1.8.0_66 Failed junit tests hadoop.fs.shell.find.TestPrint
            hadoop.fs.shell.find.TestPrint0
            hadoop.test.TestTimedOutTestsListener
            hadoop.fs.shell.find.TestIname
            hadoop.fs.shell.find.TestName
            hadoop.fs.shell.find.TestFind
            hadoop.fs.permission.TestFsPermission
            hadoop.ipc.TestRPCWaitForProxy
          JDK v1.7.0_85 Failed junit tests hadoop.fs.TestLocalFsFCStatistics
            hadoop.ipc.TestDecayRpcScheduler
            hadoop.fs.shell.find.TestPrint
            hadoop.fs.shell.find.TestPrint0
            hadoop.fs.shell.find.TestIname
            hadoop.fs.shell.find.TestName
            hadoop.fs.shell.find.TestFind
            hadoop.fs.permission.TestFsPermission
            hadoop.ipc.TestRPCWaitForProxy



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12774392/HADOOP-12563.03.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux d3aed82f6e2e 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 95d5227
          findbugs v3.0.0
          findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/branch-findbugs-hadoop-common-project_hadoop-common-warnings.html
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/diff-checkstyle-hadoop-common-project_hadoop-common.txt
          shellcheck v0.4.1
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_85.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_85.txt
          JDK v1.7.0_85 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/testReport/
          modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
          Max memory used 76MB
          Powered by Apache Yetus http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8157/console

          This message was automatically generated.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 0s Docker mode activated.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          +1 mvninstall 8m 29s trunk passed
          +1 compile 9m 19s trunk passed with JDK v1.8.0_66
          +1 compile 9m 37s trunk passed with JDK v1.7.0_91
          +1 checkstyle 1m 2s trunk passed
          +1 mvnsite 2m 7s trunk passed
          +1 mvneclipse 0m 28s trunk passed
          +1 findbugs 3m 55s trunk passed
          +1 javadoc 2m 13s trunk passed with JDK v1.8.0_66
          +1 javadoc 3m 4s trunk passed with JDK v1.7.0_91
          +1 mvninstall 2m 35s the patch passed
          +1 compile 9m 29s the patch passed with JDK v1.8.0_66
          +1 cc 9m 29s the patch passed
          +1 javac 9m 29s the patch passed
          +1 compile 10m 0s the patch passed with JDK v1.7.0_91
          +1 cc 10m 0s the patch passed
          +1 javac 10m 0s the patch passed
          -1 checkstyle 1m 30s Patch generated 8 new checkstyle issues in root (total was 27, now 8).
          +1 mvnsite 2m 4s the patch passed
          +1 mvneclipse 0m 28s the patch passed
          +1 shellcheck 0m 9s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          -1 findbugs 2m 15s hadoop-hdfs-project/hadoop-hdfs introduced 1 new FindBugs issues.
          +1 javadoc 2m 12s the patch passed with JDK v1.8.0_66
          +1 javadoc 3m 8s the patch passed with JDK v1.7.0_91
          +1 unit 9m 21s hadoop-common in the patch passed with JDK v1.8.0_66.
          -1 unit 57m 5s hadoop-hdfs in the patch failed with JDK v1.8.0_66.
          +1 unit 9m 25s hadoop-common in the patch passed with JDK v1.7.0_91.
          -1 unit 60m 51s hadoop-hdfs in the patch failed with JDK v1.7.0_91.
          -1 asflicense 0m 24s Patch generated 58 ASF License warnings.
          230m 27s



          Reason Tests
          FindBugs module:hadoop-hdfs-project/hadoop-hdfs
            Dead store to r in org.apache.hadoop.hdfs.HdfsDtFetcher.getDelegationToken(String, String) At HdfsDtFetcher.java:[line 52]
          JDK v1.8.0_66 Failed junit tests hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
            hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes
            hadoop.hdfs.server.namenode.TestAddBlockRetry
          JDK v1.7.0_91 Failed junit tests hadoop.hdfs.server.namenode.ha.TestDFSUpgradeWithHA
            hadoop.hdfs.server.datanode.TestDirectoryScanner



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12776414/HADOOP-12563.04.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 8730992730d7 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 7e47151
          findbugs v3.0.0
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/diff-checkstyle-root.txt
          shellcheck v0.4.1
          findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/new-findbugs-hadoop-hdfs-project_hadoop-hdfs.html
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91.txt
          JDK v1.7.0_91 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/testReport/
          asflicense https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/artifact/patchprocess/patch-asflicense-problems.txt
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Max memory used 76MB
          Powered by Apache Yetus http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8209/console

          This message was automatically generated.

          Allen Wittenauer added a comment -

          Ping Owen O'Malley to help review this.

          I haven't had a chance to apply and execute, but some feedback based upon visual inspection:

          1) In hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java

          @InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"})
          

          Not part of this patch, but clearly wrong nonetheless, especially with YARN-4435 in the pipeline. We should update it to include YARN while we're here.
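
          For concreteness, a minimal sketch of the suggested fix (the extra
          "YARN" entry is the only proposed change):

              import org.apache.hadoop.classification.InterfaceAudience;

              @InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce", "YARN"})
              public class Credentials { /* ... */ }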

          2)
          writeLegacyTokenStorageFile, etc.

          I think I'd rather see these named with version 0, or Java serialization, or something else. That way, if there is ever a version 2 (we drop protobuf?), we're covered. Bonus points if we could somehow tie the dtutil -format option to the methods and version.
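
          One hypothetical way to tie the names to an explicit on-disk version
          (the enum and constant names are illustrative, not from the patch):

              /** Illustrative version tags for the token file encodings. */
              public enum TokenFileFormat {
                WRITABLE,  // version 0: legacy Java Writable serialization
                PROTOBUF   // version 1: protobuf-based format from this JIRA
              }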

          3) TestDtUtilShell.java:
          System.getProperty("test.build.data", "/tmp"), "TestDtUtilShell");

          Let's set this to target/ instead of /tmp to be less racy with multiple unit tests running on the same machine.
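
          A compact sketch of that suggestion (class shape and default path are
          hypothetical):

              import java.io.File;

              public class TestDtUtilShell {
                // Default to the Maven build directory rather than /tmp, so
                // concurrent test runs on a shared machine do not race on the
                // same path.
                private static final File WORK_DIR = new File(
                    System.getProperty("test.build.data", "target/test/data"),
                    "TestDtUtilShell");
              }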

          Thanks for fixing the service name in the usage.

          Daryn Sharp added a comment -

          Skimmed the patch because it looks interesting!

          Please don't use getServiceName and getDelegationToken as your interface. It won't work for multi-token services. There's a reason why the filesystem javadoc refers to using addDelegationTokens. A compound filesystem like ViewFs requires obtaining multiple tokens. Fetching a RM token typically involves also implicitly acquiring a JHS or AHS token.

          You also cannot assume you know the alias that a provider will use, which is actually impossible when n-many tokens may be returned.

          It would be great if you had something like a -fs option so every custom fs doesn't need to register its scheme when path.getFileSystem(conf).addDelegationTokens(....) would handle all scenarios.
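
          For reference, a sketch of that pattern using the existing FileSystem
          API (argument handling here is illustrative):

              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.fs.FileSystem;
              import org.apache.hadoop.fs.Path;
              import org.apache.hadoop.security.Credentials;
              import org.apache.hadoop.security.token.Token;

              public class FetchAllTokens {
                public static void main(String[] args) throws Exception {
                  Configuration conf = new Configuration();
                  Credentials creds = new Credentials();
                  Path path = new Path(args[0]);  // e.g. a viewfs:// or hdfs:// URI
                  String renewer = args.length > 1 ? args[1] : null;
                  // The filesystem decides how many tokens it needs (ViewFs may
                  // return several) and under which aliases they are stored.
                  FileSystem fs = path.getFileSystem(conf);
                  Token<?>[] issued = fs.addDelegationTokens(renewer, creds);
                  for (Token<?> t : issued) {
                    System.out.println("fetched token for " + t.getService());
                  }
                }
              }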

          Allen Wittenauer added a comment -

          Keep in mind that we're also thinking about YARN, etc., so this won't be a filesystem-specific interface.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 0s Docker mode activated.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          +1 mvninstall 21m 14s trunk passed
          +1 compile 28m 46s trunk passed with JDK v1.8.0_66
          +1 compile 17m 27s trunk passed with JDK v1.7.0_91
          +1 checkstyle 1m 48s trunk passed
          +1 mvnsite 3m 48s trunk passed
          +1 mvneclipse 0m 48s trunk passed
          +1 findbugs 7m 25s trunk passed
          +1 javadoc 4m 40s trunk passed with JDK v1.8.0_66
          +1 javadoc 5m 50s trunk passed with JDK v1.7.0_91
          -1 mvninstall 2m 24s hadoop-common in the patch failed.
          -1 mvninstall 0m 52s hadoop-hdfs in the patch failed.
          +1 compile 23m 55s the patch passed with JDK v1.8.0_66
          +1 cc 23m 55s the patch passed
          +1 javac 23m 55s the patch passed
          +1 compile 17m 21s the patch passed with JDK v1.7.0_91
          +1 cc 17m 21s the patch passed
          -1 javac 58m 49s root-jdk1.7.0_91 with JDK v1.7.0_91 generated 4 new issues (was 729, now 729).
          +1 javac 17m 21s the patch passed
          -1 checkstyle 1m 48s Patch generated 1 new checkstyle issues in root (total was 27, now 1).
          -1 mvnsite 0m 58s hadoop-hdfs in the patch failed.
          +1 mvneclipse 0m 50s the patch passed
          +1 shellcheck 0m 13s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          -1 findbugs 0m 48s hadoop-hdfs in the patch failed.
          -1 javadoc 9m 32s hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66 with JDK v1.8.0_66 generated 2 new issues (was 7, now 9).
          +1 javadoc 4m 42s the patch passed with JDK v1.8.0_66
          -1 javadoc 15m 52s hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91 with JDK v1.7.0_91 generated 2 new issues (was 7, now 9).
          +1 javadoc 5m 45s the patch passed with JDK v1.7.0_91
          -1 unit 18m 27s hadoop-common in the patch failed with JDK v1.8.0_66.
          -1 unit 1m 9s hadoop-hdfs in the patch failed with JDK v1.8.0_66.
          -1 unit 16m 2s hadoop-common in the patch failed with JDK v1.7.0_91.
          -1 unit 0m 48s hadoop-hdfs in the patch failed with JDK v1.7.0_91.
          +1 asflicense 0m 38s Patch does not generate ASF License warnings.
          196m 59s



          Reason Tests
          JDK v1.8.0_66 Failed junit tests hadoop.fs.TestLocalFsFCStatistics
            hadoop.fs.shell.find.TestPrint
            hadoop.fs.shell.find.TestPrint0
            hadoop.test.TestTimedOutTestsListener
            hadoop.fs.shell.find.TestIname
            hadoop.fs.shell.find.TestName
            hadoop.fs.shell.find.TestFind
            hadoop.ipc.TestRPCWaitForProxy
          JDK v1.7.0_91 Failed junit tests hadoop.fs.TestLocalFsFCStatistics
            hadoop.fs.shell.find.TestPrint
            hadoop.fs.shell.find.TestPrint0
            hadoop.test.TestTimedOutTestsListener
            hadoop.fs.shell.find.TestIname
            hadoop.fs.shell.find.TestName
            hadoop.fs.shell.find.TestFind
            hadoop.ipc.TestRPCWaitForProxy



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12776938/HADOOP-12563.05.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 782c2f80dcd9 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 576b569
          findbugs v3.0.0
          mvninstall https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-mvninstall-hadoop-common-project_hadoop-common.txt
          mvninstall https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs.txt
          javac root-jdk1.7.0_91: https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/diff-compile-javac-root-jdk1.7.0_91.txt
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/diff-checkstyle-root.txt
          mvnsite https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs.txt
          shellcheck v0.4.1
          findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-findbugs-hadoop-hdfs-project_hadoop-hdfs.txt
          javadoc hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66: https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          javadoc hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91: https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/diff-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_91.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_91.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_91.txt
          JDK v1.7.0_91 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Max memory used 76MB
          Powered by Apache Yetus 0.1.0-SNAPSHOT http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8224/console

          This message was automatically generated.

          Matthew Paduano added a comment -

          I agree with the comment about addDelegationTokens. I changed the interface in DtFetcher to
          use this more general signature and included the Credentials object in the interface so that the
          DtFetcher implementation class may have full control over how the tokens are placed into the
          credentials map.
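
          A sketch of the revised contract (method names approximate the patch
          and should be treated as illustrative rather than final):

              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.io.Text;
              import org.apache.hadoop.security.Credentials;
              import org.apache.hadoop.security.token.Token;

              public interface DtFetcher {
                /** Key used to select an implementation via ServiceLoader. */
                Text getServiceName();

                /**
                 * Fetch token(s) for the given service URL and add them to
                 * creds; the implementation controls how the tokens are
                 * placed into the credentials map.
                 */
                Token<?> addDelegationTokens(Configuration conf, Credentials creds,
                    String renewer, String url) throws Exception;
              }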

          Re -fs and the abstraction used here: I agree that ServiceLoader<DtFetcher> is "reabstracting"
          one method (getDelegationToken) from the FileSystem layer. FileSystem et al are nicely abstracted.
          But FileSystem is not generalized to other services (e.g. Yarn). I think what is needed here is a
          generalization of that FileSystem abstraction. getServiceName, or something like it, is needed
          as a key to identify the implementation to use (this could be the generalized analog of -fs, once
          that analog exists).

          Perhaps someone can think about the proper way to generalize implementations across
          different projects/services so that new service specific tools can be more cleanly abstracted. I
          think this is a separate JIRA from this one though. This ticket changes the serialization format
          of the token files and adds the ability to add multiple tokens to a single file.

          The alias thing is a bit of a hack. We have discussed adding alias as a new field (or using
          the key from the credentials object) rather than overwriting the service field in the token. But
          in order to keep the scope of this ticket more limited, we decided to leave the behavior as it
          is for now so that we can get this new functionality rolled out. Changing that behavior will
          involve checking all the places Token is used.

          Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 0s Docker mode activated.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          +1 mvninstall 7m 24s trunk passed
          +1 compile 7m 40s trunk passed with JDK v1.8.0_66
          +1 compile 8m 45s trunk passed with JDK v1.7.0_91
          +1 checkstyle 0m 59s trunk passed
          +1 mvnsite 1m 52s trunk passed
          +1 mvneclipse 0m 27s trunk passed
          +1 findbugs 3m 37s trunk passed
          +1 javadoc 2m 2s trunk passed with JDK v1.8.0_66
          +1 javadoc 2m 47s trunk passed with JDK v1.7.0_91
          -1 mvninstall 0m 48s hadoop-hdfs in the patch failed.
          +1 compile 8m 4s the patch passed with JDK v1.8.0_66
          +1 cc 8m 4s the patch passed
          +1 javac 8m 4s the patch passed
          +1 compile 8m 40s the patch passed with JDK v1.7.0_91
          +1 cc 8m 40s the patch passed
          -1 javac 25m 55s root-jdk1.7.0_91 with JDK v1.7.0_91 generated 4 new issues (was 723, now 723).
          +1 javac 8m 40s the patch passed
          -1 checkstyle 1m 16s Patch generated 1 new checkstyle issues in root (total was 28, now 3).
          +1 mvnsite 1m 54s the patch passed
          +1 mvneclipse 0m 27s the patch passed
          +1 shellcheck 0m 8s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 3m 59s the patch passed
          +1 javadoc 1m 56s the patch passed with JDK v1.8.0_66
          +1 javadoc 2m 51s the patch passed with JDK v1.7.0_91
          -1 unit 7m 6s hadoop-common in the patch failed with JDK v1.8.0_66.
          -1 unit 51m 16s hadoop-hdfs in the patch failed with JDK v1.8.0_66.
          +1 unit 8m 0s hadoop-common in the patch passed with JDK v1.7.0_91.
          +1 unit 50m 21s hadoop-hdfs in the patch passed with JDK v1.7.0_91.
          -1 asflicense 0m 19s Patch generated 58 ASF License warnings.
          185m 34s



          Reason Tests
          JDK v1.8.0_66 Failed junit tests hadoop.http.TestHttpServer
            hadoop.hdfs.server.namenode.TestCacheDirectives
            hadoop.hdfs.TestDFSUpgradeFromImage



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12777611/HADOOP-12563.06.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 55898da20849 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 915cd6c
          findbugs v3.0.0
          mvninstall https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs.txt
          javac root-jdk1.7.0_91: https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/diff-compile-javac-root-jdk1.7.0_91.txt
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/diff-checkstyle-root.txt
          shellcheck v0.4.1
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          JDK v1.7.0_91 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/testReport/
          asflicense https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/artifact/patchprocess/patch-asflicense-problems.txt
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Max memory used 75MB
          Powered by Apache Yetus 0.1.0 http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8236/console

          This message was automatically generated.

          Hide
          lmccay Larry McCay added a comment -

          I find this patch really interesting.

          It touches on some of the pain points that I have been thinking about for some time.
          I would like to see a bit more detail on the specific problems that are solved by this approach, though.
          The attached generalized_token_case doc is a good start, but I would like to see the addressed problems enumerated.

          I also wonder whether a token acquired through dtutil would be usable by services that can be configured to accept only this token as a representation of the authentication event. Given some trust mechanism, such as SSL (even better, 2-way SSL), we should be able to cryptographically verify the token and determine whether its issuer is a trusted authority.

          I'm also curious about the choice of protobuf for the token rather than JWT.
          I'd like to understand the differences in portability that you see between the two.
          JWT has become a very popular format for such things.

          Hide
          aw Allen Wittenauer added a comment -

          I'm also curious about the choice of protobuf for the token rather than JWT. I'd like to understand the differences in portability that you see between the two. JWT has become a very popular format for such things.

          • extremely portable; hooks for almost every language you can think of
          • if the app is doing RPC (probably the majority case today for most DT file usage), protobuf libraries are already available
          • changing from one serialization format to another is a fairly trivial change; the content is left mostly untouched, so we avoid the conversation of what goes where
          • can be evolved to support more fields (e.g., service aliasing, something we've been discussing internally) as necessary

          The ability to support more than one format is part of the design here. If protobuf isn't sufficient to handle all use cases, another format could be added easily enough; e.g., there's no reason why JWT couldn't be added as a third option at a later date. (A hypothetical sketch of such a format switch follows.)
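
          For illustration, a minimal Java sketch of how a pluggable format switch on the write path might look. The Format enum and the protobuf branch are hypothetical placeholders, not the patch's actual API; only Credentials.writeTokenStorageToStream() is existing Hadoop API.

    import java.io.DataOutputStream;
    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.Credentials;

    public class TokenFileWriter {
      /** Hypothetical format switch; names are placeholders. */
      enum Format { JAVA, PROTOBUF }

      static void write(Credentials creds, Path file, Configuration conf,
          Format format) throws IOException {
        try (DataOutputStream out = FileSystem.getLocal(conf).create(file)) {
          switch (format) {
          case JAVA:
            // existing Writable-based (legacy Java) serialization
            creds.writeTokenStorageToStream(out);
            break;
          case PROTOBUF:
            // placeholder for the protobuf writer this patch introduces;
            // the real method name may differ
            throw new UnsupportedOperationException("sketch only");
          }
        }
      }
    }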

          Hide
          lmccay Larry McCay added a comment -

          Allen Wittenauer - thanks for the response - somehow I missed it earlier.

          The ability to have multiple formats would be great.
          There has been some other similar discussion around using JWT as a normalized authentication token.
          I'd like to dig into this ability and make sure it is accounted for in the current design.

          I envision an hinit command that results in a protected (JWT) token file that can be used for authentication.
          This is very much in line with dtutil - apart from the current token format.

          There is a filter available in trunk for use with the UIs that accepts cookies carrying JWT tokens. It leverages the Nimbus library for JWT support.

          So, can we talk about the ability to have different formats now or do we have to talk about adding the ability in a follow up to this?

          Thanks again!

          Hide
          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 0s Docker mode activated.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
          0 mvndep 0m 57s Maven dependency ordering for branch
          +1 mvninstall 7m 41s trunk passed
          +1 compile 6m 22s trunk passed with JDK v1.8.0_66
          +1 compile 7m 5s trunk passed with JDK v1.7.0_91
          +1 checkstyle 1m 0s trunk passed
          +1 mvnsite 1m 59s trunk passed
          +1 mvneclipse 0m 26s trunk passed
          +1 findbugs 3m 39s trunk passed
          +1 javadoc 2m 0s trunk passed with JDK v1.8.0_66
          +1 javadoc 2m 53s trunk passed with JDK v1.7.0_91
          0 mvndep 0m 17s Maven dependency ordering for patch
          +1 mvninstall 2m 20s the patch passed
          +1 compile 5m 42s the patch passed with JDK v1.8.0_66
          +1 cc 5m 42s the patch passed
          +1 javac 5m 42s the patch passed
          +1 compile 6m 40s the patch passed with JDK v1.7.0_91
          +1 cc 6m 40s the patch passed
          +1 javac 6m 40s the patch passed
          -1 checkstyle 0m 56s root: patch generated 2 new + 0 unchanged - 27 fixed = 2 total (was 27)
          +1 mvnsite 1m 51s the patch passed
          +1 mvneclipse 0m 28s the patch passed
          +1 shellcheck 0m 7s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 7s the patch passed
          +1 javadoc 1m 57s the patch passed with JDK v1.8.0_66
          +1 javadoc 2m 50s the patch passed with JDK v1.7.0_91
          -1 unit 7m 56s hadoop-common in the patch failed with JDK v1.8.0_66.
          -1 unit 53m 19s hadoop-hdfs in the patch failed with JDK v1.8.0_66.
          -1 unit 7m 54s hadoop-common in the patch failed with JDK v1.7.0_91.
          +1 unit 50m 48s hadoop-hdfs in the patch passed with JDK v1.7.0_91.
          +1 asflicense 0m 25s Patch does not generate ASF License warnings.
          183m 18s



          Reason Tests
          JDK v1.8.0_66 Failed junit tests hadoop.ipc.TestIPC
            hadoop.hdfs.server.datanode.TestBlockScanner
          JDK v1.7.0_91 Failed junit tests hadoop.security.ssl.TestReloadingX509TrustManager
            hadoop.ipc.TestIPC



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:0ca8df7
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12777611/HADOOP-12563.06.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux c5e320f8f4e7 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / da77f42
          Default Java 1.7.0_91
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_66 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_91
          shellcheck v0.4.1
          findbugs v3.0.0
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/diff-checkstyle-root.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_91.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_66.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_91.txt
          JDK v1.7.0_91 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Max memory used 76MB
          Powered by Apache Yetus 0.2.0-SNAPSHOT http://yetus.apache.org
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8431/console

          This message was automatically generated.

          Hide
          aw Allen Wittenauer added a comment -

          can we talk about the ability to have different formats now or do we have to talk about adding the ability in a follow up to this?

          I'd prefer we cover them as a separate JIRA. My plan was to only commit this to trunk, since we already know that some ecosystem projects (e.g., Spark) are doing questionable stuff like shading the Credentials class. If everyone thinks the general framework here is OK, then let's commit this and move on to further enhancements such as adding other token formats.

          (Yes, technically the changes here are "legally" compatible. But there have been enough surprising/damaging changes in branch-2 throughout its extremely long lifetime that unless one completely disregards the users, it's unconscionable to make the situation even worse by committing this there.)

          Hide
          lmccay Larry McCay added a comment -

          That seems reasonable to me.

          Hide
          aw Allen Wittenauer added a comment - - edited

          I'm still playing around a bit, but some feedback:

          • Non-existent file handling should print a message as well as give a failure exit code (see the sketch after this list):
            $ rm foo
            $ hadoop dtutil get foo
            $ echo $?
            0
            
          • fetchdt needs to default to use the java-based serialization for backward compatibility. Simple script that shows it:
          #!/bin/bash
          
          URL=hdfs://$(hostname):9000
          
          hdfs fetchdt dt-fetchdt
          hadoop dtutil get ${URL} dt-util-default
          hadoop dtutil get ${URL} -format java dt-util-java
          hadoop dtutil get ${URL} -format protobuf dt-util-proto
          ls -l dt-*
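
          A minimal sketch of the missing-file behavior asked for in the first bullet above; the class name, message wording and exit code are made up for illustration:

    import java.io.File;

    public class MissingFileCheck {
      // Fail loudly instead of exiting 0 when the token file named on
      // the command line does not exist.
      static int run(String filename) {
        File f = new File(filename);
        if (!f.exists() || !f.isFile()) {
          System.err.println("dtutil: cannot open " + filename
              + ": no such file");
          return 1;  // or 44, per the exit-code proposal further down
        }
        // ... proceed to read the token file ...
        return 0;
      }

      public static void main(String[] args) {
        System.exit(run(args.length > 0 ? args[0] : ""));
      }
    }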
          
          Hide
          stevel@apache.org Steve Loughran added a comment -

          list

          when listing tokens

          1. attempt to unmarshall them, then call toString() on the value
          2. and if they are delegation tokens, print out the expiry time in a human form.

          Add a command to verify that a token for a service exists, and is currently valid. (A minimal listing/expiry sketch follows at the end of this comment.)

          DtFetcher

          I really like the idea of a standard fetch interface, which is of tangible benefit to any application that wants to be able to load tokens off remote services without having to compile in support for each service. I would point to the fun Spark has to go through to get HBase and Hive tokens as an example, as well as observing that adding support for a new service (e.g. Kafka) will require some more reflection pain (and/or Spark implementing its own API).

          Accordingly, a use case that the underlying code must support here is being usable inside YARN applications themselves.

          Command Shell

          if we're going to do some new entry point stuff, can that be done separately?

          In particular, the YARN-679 service launcher proposes modifying ExitUtils.ExitException to support exit codes better, and allowing any raised exception to also supply an exit code. This allows apps to fail with more meaningful errors than just "-1". We can pull that little bit out into its own JIRA, HADOOP-9626.

          BTW, the exit code for a missing file should be 44, which is 404 compressed into one byte. Hence my proposed list of exit codes
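
          A minimal sketch of the listing behavior suggested above, assuming the token file is readable with the long-standing Credentials.readTokenStorageFile() API. Note that getMaxDate() is the final renewal deadline, used here as a rough validity proxy; checking current validity against the issuing service would need an RPC.

    import java.io.IOException;
    import java.util.Date;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.token.Token;
    import org.apache.hadoop.security.token.TokenIdentifier;
    import org.apache.hadoop.security.token.delegation.AbstractDelegationTokenIdentifier;

    public class ListTokens {
      public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        Credentials creds =
            Credentials.readTokenStorageFile(new Path(args[0]), conf);
        for (Token<? extends TokenIdentifier> token : creds.getAllTokens()) {
          // toString() prints kind, service and identifier
          System.out.println(token);
          TokenIdentifier id = token.decodeIdentifier();
          if (id instanceof AbstractDelegationTokenIdentifier) {
            long maxDate = ((AbstractDelegationTokenIdentifier) id).getMaxDate();
            System.out.println("  renewable until: " + new Date(maxDate)
                + (maxDate < System.currentTimeMillis() ? " (EXPIRED)" : ""));
          }
        }
      }
    }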

          Hide
          stevel@apache.org Steve Loughran added a comment -
          1. Looking at the DtFetcher interface - it needs to take a Configuration. How else can it know what to work with, if there have been any runtime config options set above the -site.xml values? (A sketch of a Configuration-taking interface follows this comment.)
          2. The change to using a protobuf persistent format is significant enough that it should be called out into its own patch. That doesn't mean there's anything wrong with it (it makes a lot of sense); it's just a significant enough change that more people should be commenting on it.
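
          For illustration, a sketch of what a Configuration-taking DtFetcher could look like. The method names here are illustrative rather than the committed interface; isTokenRequired() anticipates a suggestion made later in this thread.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.token.Token;

    public interface DtFetcher {
      /** Scheme this fetcher handles, e.g. "hdfs". */
      Text getServiceName();

      /** Whether this service actually needs a token at all. */
      boolean isTokenRequired();

      /** Fetch a token for url using conf, add it to creds, and
          return it so the caller can alias it. */
      Token<?> addDelegationTokens(Configuration conf, Credentials creds,
          String renewer, String url) throws Exception;
    }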
          Hide
          stevel@apache.org Steve Loughran added a comment -

          SLIDER-1081 added a similar command, in order to create all the tokens needed to simulate launching Slider and Spark under Oozie.

          slider tokens lets you create a token file with HDFS, RM and ATS delegation tokens, and to list the contents of a file. Tokens will be created as the current user unless keytab and principal are defined.

          Looking at that code, if I were to evolve it I'd add

          1. a renew option: go through the tokens, renew them (see the sketch below).
          2. a way to explicitly list the HDFS, webhdfs, RM, NN, .. endpoints.
          3. a way to explicitly identify the classname + endpoint of other token providers (hbase, hive, ...)

          There's a class there, CredentialUtils (https://github.com/apache/incubator-slider/blob/develop/slider-core/src/main/java/org/apache/slider/core/launch/CredentialUtils.java), which contains core functions usable here and by YARN applications.
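
          A minimal sketch of the renew option from item 1, using the existing Token.renew(Configuration) hook, which dispatches to the TokenRenewer registered for each token kind:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.token.Token;

    public class RenewAll {
      static void renewAll(Path tokenFile, Configuration conf)
          throws IOException, InterruptedException {
        Credentials creds = Credentials.readTokenStorageFile(tokenFile, conf);
        for (Token<?> token : creds.getAllTokens()) {
          // renew() returns the new expiry time in milliseconds
          long newExpiry = token.renew(conf);
          System.out.println(token.getKind() + " renewed until " + newExpiry);
        }
      }
    }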

          Hide
          stevel@apache.org Steve Loughran added a comment -

          SPARK-11265 and SPARK-12241 show how apps will benefit from a standard API for loading tokens; each service needs its own custom reflection code, something which was inadvertently broken when upgrading the Hive version. To add support for new services (e.g. Accumulo), Spark would currently need to patch its Client.scala class. That's not sustainable.

          Hide
          mattpaduano Matthew Paduano added a comment -

          additions:

          • better unit test coverage
          • make fetchdt use legacy output API
          • improve print function

          tested via:

          • unittests
          • test-patch
          • manual test of fetchdt token
          • manual test of an aliased token file from an external host (see the sketch below for what aliasing does to the token's service field), e.g.
            on host A (10.0.2.28):
            hadoop dtutil get hdfs://localhost:9000/ -alias 10.0.2.28:9000 manual_test_alias
            ssh -L 10.0.2.28:9000:127.0.0.1:9000 10.0.2.28

          on host B (e.g. 10.0.2.24):
          scp 10.0.2.28:/home/mattp/manual_test_alias .
          HADOOP_TOKEN_FILE_LOCATION=/Users/mattp/dev/HADOOP/hadoop/manual_test_alias hadoop fs -ls hdfs://10.0.2.28:9000/user
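
          For readers unfamiliar with -alias, here is a rough sketch of what the aliasing step amounts to in terms of the existing Credentials/Token API; the helper below is illustrative, not dtutil's actual implementation:

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.security.Credentials;
    import org.apache.hadoop.security.token.Token;

    public class AliasSketch {
      // The token was issued against "localhost:9000", but the remote
      // host looks tokens up by the address it actually dials, so the
      // service field is rewritten before the file is shipped.
      static void alias(Credentials creds, Text oldService, Text newService) {
        Token<?> token = creds.getToken(oldService);
        if (token != null) {
          token.setService(newService);
          creds.addToken(newService, token);
        }
      }
    }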

          Hide
          mattpaduano Matthew Paduano added a comment -

          dtutil-test-out is a capture (via set -x) of testing dtutil against a dev hadoop instance. It is useful as a manual test and as a source of example syntax and commands.

          Hide
          aw Allen Wittenauer added a comment -

          (re-attaching 07 so that precommit can see it since it only looks at the last file...)

          Hide
          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 9s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 14s Maven dependency ordering for branch
          +1 mvninstall 6m 35s trunk passed
          +1 compile 5m 57s trunk passed with JDK v1.8.0_77
          +1 compile 6m 44s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 3s trunk passed
          +1 mvnsite 1m 49s trunk passed
          +1 mvneclipse 0m 28s trunk passed
          +1 findbugs 3m 30s trunk passed
          +1 javadoc 1m 57s trunk passed with JDK v1.8.0_77
          +1 javadoc 2m 47s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 14s Maven dependency ordering for patch
          +1 mvninstall 1m 28s the patch passed
          +1 compile 5m 57s the patch passed with JDK v1.8.0_77
          +1 cc 5m 57s the patch passed
          +1 javac 5m 57s the patch passed
          +1 compile 6m 42s the patch passed with JDK v1.7.0_95
          +1 cc 6m 42s the patch passed
          +1 javac 6m 42s the patch passed
          +1 checkstyle 1m 22s root: patch generated 0 new + 6 unchanged - 27 fixed = 6 total (was 33)
          +1 mvnsite 1m 49s the patch passed
          +1 mvneclipse 0m 28s the patch passed
          +1 shellcheck 0m 9s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 0s the patch passed
          +1 javadoc 2m 1s the patch passed with JDK v1.8.0_77
          +1 javadoc 2m 51s the patch passed with JDK v1.7.0_95
          -1 unit 7m 25s hadoop-common in the patch failed with JDK v1.8.0_77.
          -1 unit 56m 24s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          -1 unit 7m 34s hadoop-common in the patch failed with JDK v1.7.0_95.
          -1 unit 53m 31s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          -1 asflicense 0m 25s Patch generated 3 ASF License warnings.
          184m 55s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.hdfs.TestHFlush
          JDK v1.8.0_77 Timed out junit tests org.apache.hadoop.util.TestNativeLibraryChecker
          JDK v1.7.0_95 Failed junit tests hadoop.hdfs.TestHFlush
            hadoop.hdfs.server.namenode.TestNameNodeMXBean
          JDK v1.7.0_95 Timed out junit tests org.apache.hadoop.util.TestNativeLibraryChecker



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12796187/HADOOP-12563.07.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux cb6c526e3dd1 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / e4fc609
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/testReport/
          asflicense https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/artifact/patchprocess/patch-asflicense-problems.txt
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8979/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          Hide
          stevel@apache.org Steve Loughran added a comment -

          Coming along nicely.

          Someone other than me will have to look at marshalling changes.

          1. If a fetch fails, throw an exception rather than just returning null.
          2. Can you add isTokenRequired(): boolean to the DtFetcher interface? That way, it's up to the FS, HBase, etc. to decide whether this instance needs a token.
          3. Can you do the fetchers for the RM, ATS and KMS, to make sure they fit in with the design too.
          4. CLI needs a way to specify keytab and principal to log in as. Currently it assumes caller is kinited in.
          5. if you make the subcommands take an output stream, then in tests you can pass in something that writes to a string buffer and check its contents to validate. Only in main() calls do you need to pass in System.out. I can see that you play with System.out in tests, but that prevents the code being used in different places. (A minimal sketch of this pattern follows this comment.)
          6. I'd like the get subcommand (and ideally the others) to be easy to call from other code, especially YARN apps. The inner load/get/remove/renew logic should all be isolated into something anything can call, with command arg parsing taking place outside it.

          minor

          • use SLF4J and you can skip the String.format() clauses in logging
          • Unsure about using Date.toString() to represent dates, especially in a CLI which may be expected to be stable. It's deprecated for a reason.
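
          A minimal sketch of the injected-output-stream pattern from point 5; the class and method names are made up:

    import java.io.ByteArrayOutputStream;
    import java.io.PrintStream;

    public class PrintCommand {
      private final PrintStream out;

      PrintCommand(PrintStream out) {
        this.out = out;
      }

      void execute(String tokenDescription) {
        out.println(tokenDescription);
      }

      public static void main(String[] args) {
        // production wiring
        new PrintCommand(System.out).execute("example token");

        // test wiring: capture the output and assert on it,
        // without ever touching System.out
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new PrintCommand(new PrintStream(buf)).execute("example token");
        assert buf.toString().contains("example token");
      }
    }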
          Hide
          mattpaduano Matthew Paduano added a comment -

          Thanks for the quick reply. I would like to briefly recall the history of this JIRA from my POV. Altiscale needed a utility that could (1) overwrite the alias field in a credentials file, to enable passing token files outside of firewalls, and (2) print a base64 encoded string to paste into URLs as a DELEGATION parameter. aw suggested I throw in protobufs, invent DtFetcher and model the command syntax after CredentialsShell. I tried to oblige. But mostly all we wanted was to let users have tokens outside firewalls and use them via HTTP.
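
          For context, the base64-for-URLs use case maps directly onto Token's existing URL-safe codec, encodeToUrlString()/decodeFromUrlString(); a small sketch, with the parameter name taken from the WebHDFS convention:

    import java.io.IOException;

    import org.apache.hadoop.security.token.Token;
    import org.apache.hadoop.security.token.TokenIdentifier;

    public class UrlStringSketch {
      static String toUrlParam(Token<? extends TokenIdentifier> token)
          throws IOException {
        // URL-safe base64 of the serialized token
        return "delegation=" + token.encodeToUrlString();
      }

      static Token<TokenIdentifier> fromUrlParam(String encoded)
          throws IOException {
        Token<TokenIdentifier> token = new Token<>();
        token.decodeFromUrlString(encoded);
        return token;
      }
    }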

          1. Not sure specifically what you mean. If the DtFetcher code throws an exception, then it propagates through Get. When auth fails, for example, one might see "java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)];". In response to your comment I did two things: (a) I made HdfsDtFetcher throw an IOE if fs.getDelegationToken(renewer) returns null without throwing an Exception. (b) In DtUtilShell.Get.execute() I trapped an NPE if the fetcher returns null, indicating the token should not be aliased. Maybe addDelegationTokens() needs to accept the alias parameter in case returning a single token for aliasing is not sufficient? Is this the "return null" part you are referring to?

          2. I added the method isTokenRequired() to the interface. I am not totally clear on how this is to be used. In HdfsDtFetcher and RmDtFetcher (to be added via YARN-4435 once this interface is committed here), I implemented this as "return UserGroupInformation.isSecurityEnabled();".

          3. I have added Hdfs, WebHdfs, SWebHdfs and Rm DtFetcher examples (RmDtFetcher is YARN-4435). I am not sure I really know how to do the RM case correctly and hope the review of that JIRA will straighten me out! wrt ATS and KMS, perhaps someone can show me some pointers to docs/examples where I can figure out how to fetch those tokens correctly/completely? I can add new JIRAs with my best guess at how to do it.

          4. I agree it would be cool to have some mechanism to let hadoop know how to kinit for an OS user who is already authenticated and has OS perms to access a keytab, e.g. "kinit -kt ${KEYTAB_HOME} ${PRINCIPAL}". But this seems like a hadoop-wide question and not specific to this utility. Perhaps a follow-up JIRA?

          5. Part of this came from just following CredentialsShell. But I can touch that up and change the test code.

          6. get is somewhat abstracted by DtFetcher.addDelegationTokens(). Renew is abstracted by token.renew(). Is all you are seeking a method that iterates tokens in a Credentials object and calls renew()? matching alias and/or service? Or do you additionally want the same file management operations with file names as argument etc. in an API? Can this be follow-up JIRA?

          minor:
          1. will remove String.format() if preferred.
          2. An awful lot of Date is deprecated. But Date.toString() is not deprecated. Nor is the constructor Date(long).
          http://docs.oracle.com/javase/6/docs/api/java/util/Date.html
          https://docs.oracle.com/javase/7/docs/api/java/sql/Date.html
          https://docs.oracle.com/javase/8/docs/api/java/sql/Date.html
          ???

          Hide
          mattpaduano Matthew Paduano added a comment -

          for the convenience of reviewers of patch 07, this is a diff from patch 07 to patch 08 (as though in an unmerged branch).

          Hide
          mattpaduano Matthew Paduano added a comment -

          addresses issues from latest comments

          Hide
          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 18s Docker mode activated.
          0 shelldocs 0m 3s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 48s Maven dependency ordering for branch
          +1 mvninstall 6m 42s trunk passed
          +1 compile 6m 10s trunk passed with JDK v1.8.0_74
          +1 compile 6m 49s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 4s trunk passed
          +1 mvnsite 1m 49s trunk passed
          +1 mvneclipse 0m 28s trunk passed
          +1 findbugs 3m 27s trunk passed
          +1 javadoc 2m 1s trunk passed with JDK v1.8.0_74
          +1 javadoc 2m 47s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 15s Maven dependency ordering for patch
          +1 mvninstall 1m 29s the patch passed
          +1 compile 5m 59s the patch passed with JDK v1.8.0_74
          +1 cc 5m 59s the patch passed
          +1 javac 5m 59s the patch passed
          +1 compile 6m 48s the patch passed with JDK v1.7.0_95
          +1 cc 6m 48s the patch passed
          +1 javac 6m 48s the patch passed
          +1 checkstyle 1m 4s root: patch generated 0 new + 6 unchanged - 27 fixed = 6 total (was 33)
          +1 mvnsite 1m 49s the patch passed
          +1 mvneclipse 0m 27s the patch passed
          +1 shellcheck 0m 10s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 3m 58s the patch passed
          +1 javadoc 2m 2s the patch passed with JDK v1.8.0_74
          +1 javadoc 2m 47s the patch passed with JDK v1.7.0_95
          +1 unit 8m 4s hadoop-common in the patch passed with JDK v1.8.0_74.
          -1 unit 69m 33s hadoop-hdfs in the patch failed with JDK v1.8.0_74.
          +1 unit 8m 43s hadoop-common in the patch passed with JDK v1.7.0_95.
          -1 unit 69m 6s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 29s Patch does not generate ASF License warnings.
          216m 36s



          Reason Tests
          JDK v1.8.0_74 Failed junit tests hadoop.TestRefreshCallQueue
            hadoop.hdfs.TestEncryptionZones
            hadoop.hdfs.TestRollingUpgrade
            hadoop.hdfs.shortcircuit.TestShortCircuitCache
          JDK v1.7.0_95 Failed junit tests hadoop.TestRefreshCallQueue



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12796441/HADOOP-12563.08.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 4ab9a5a39f60 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / aac4d65
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_74 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_74.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_74.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/8994/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          Hide
          stevel@apache.org Steve Loughran added a comment -

          Here are the use cases I've encountered related to this:

          • saving tokens for a principal to a file (HDFS, RM, ATS), so that a process can be started in an env with HADOOP_TOKEN_FILE_LOCATION pointing at the file. This lets me test oozie deployment outside of oozie.
          • spark yarn client having to pick up tokens for HBase, Hive and others. This is done on a case-by-case basis through introspection ugliness. With a standard interface, all you'd need to do is load the implementation and invoke it.
          • spark AM doing ticket-based token retrieval, for propagation to executors in containers.

          So: one standalone, two within an app, all benefiting from a standard API. Use case #1 can be handled by your CLI tool, if it supports keytab and principal login.
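
          A minimal sketch of use case #1, using the existing Credentials and UserGroupInformation APIs (the file path and env plumbing are illustrative, not part of this patch):

              // Writer side: persist the current user's tokens to a file.
              Configuration conf = new Configuration();
              Credentials creds = UserGroupInformation.getCurrentUser().getCredentials();
              creds.writeTokenStorageFile(new Path("file:///tmp/tokens.bin"), conf);

              // Launcher side: point the child process at the file; UGI reads
              // HADOOP_TOKEN_FILE_LOCATION automatically when security is enabled.
              //   env.put("HADOOP_TOKEN_FILE_LOCATION", "/tmp/tokens.bin");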

          1. Return values: there was a comment about returning null; I want to make sure that is not the case, and that failures surface as exceptions.
          2. The rationale for isTokenRequired() is related to other uses. For example, Spark only needs an HBase token if (a) HBase is on the classpath, and (b) hbase-site.xml provides the binding for HBase and indicates that authentication is needed. You may have unauthenticated HBase within a Kerberized cluster. Similarly for RM web access, the implementation would look at the auth method for the web UI; again, there may be none, even on a secure cluster.

          3. RM and ATS code can be found in these classes:

          https://github.com/apache/incubator-slider/blob/develop/slider-core/src/main/java/org/apache/slider/core/launch/CredentialUtils.java
          https://github.com/apache/incubator-slider/blob/develop/slider-core/src/main/java/org/apache/slider/client/TokensOperation.java

          4. keytabs & principals

          I agree it would be cool to have some mechanism to let hadoop know how to kinit for an OS user who is already authenticated and has OS perms to access a keytab, e.g. "kinit -kt $

          It's called UserGroupInformation.loginUserFromKeytabAndReturnUGI(), and it is easy to use, provided you make it the first thing you do in your code after reading all config, and before talking to any services. Look in TokensOperation for the code to lift.

          Finally, note that token acquisition on HA clusters is trickier than you'd expect ... we'll all need to review that code.
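
          For point 4, a short sketch of that flow (principal and keytab path are placeholders):

              // Log in from the keytab before any RPC, then fetch tokens as that user.
              UserGroupInformation ugi =
                  UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                      "user@EXAMPLE.COM", "/etc/security/keytabs/user.keytab");
              Credentials creds = ugi.doAs(new PrivilegedExceptionAction<Credentials>() {
                public Credentials run() throws Exception {
                  Credentials c = new Credentials();
                  // service-specific token fetching goes here
                  return c;
                }
              });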

          mattpaduano Matthew Paduano added a comment -

          Thank you very much for the clarifications. I appreciate it.

          I will try to incorporate these concerns and get a new patch uploaded
          promptly. I will also create some new JIRAs for the individual fetchers.

          mattpaduano Matthew Paduano added a comment -
          • adds flags to pass keytab/principal and login
          • factors file operations into separate class
          • adds Configuration to DtFetcher interface
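
          For illustration, a rough sketch of what an implementation of that interface could look like; the method names follow this discussion, and the body is a placeholder rather than the committed code:

              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.io.Text;
              import org.apache.hadoop.security.Credentials;
              import org.apache.hadoop.security.UserGroupInformation;
              import org.apache.hadoop.security.token.Token;

              public class ExampleDtFetcher implements DtFetcher {
                // Service name the shell matches against on the command line.
                public Text getServiceName() { return new Text("example"); }

                // A token may be unnecessary even on a secure cluster, so check
                // the security state rather than assuming.
                public boolean isTokenRequired() {
                  return UserGroupInformation.isSecurityEnabled();
                }

                // Fetch a token from the service at url and add it to creds;
                // returning it lets the caller alias the token.
                public Token<?> addDelegationTokens(Configuration conf, Credentials creds,
                    String renewer, String url) throws Exception {
                  return null;  // placeholder: real fetchers do a service RPC here
                }
              }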
          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 14s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 41s Maven dependency ordering for branch
          +1 mvninstall 7m 39s trunk passed
          +1 compile 6m 51s trunk passed with JDK v1.8.0_77
          +1 compile 7m 32s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 12s trunk passed
          +1 mvnsite 2m 3s trunk passed
          +1 mvneclipse 0m 30s trunk passed
          +1 findbugs 3m 51s trunk passed
          +1 javadoc 2m 17s trunk passed with JDK v1.8.0_77
          +1 javadoc 2m 58s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 15s Maven dependency ordering for patch
          +1 mvninstall 1m 32s the patch passed
          +1 compile 5m 47s the patch passed with JDK v1.8.0_77
          -1 cc 7m 34s root-jdk1.8.0_77 with JDK v1.8.0_77 generated 1 new + 10 unchanged - 1 fixed = 11 total (was 11)
          +1 cc 5m 47s the patch passed
          +1 javac 5m 47s the patch passed
          +1 compile 6m 43s the patch passed with JDK v1.7.0_95
          +1 cc 6m 43s the patch passed
          +1 javac 6m 43s the patch passed
          +1 checkstyle 1m 5s root: patch generated 0 new + 6 unchanged - 27 fixed = 6 total (was 33)
          +1 mvnsite 1m 47s the patch passed
          +1 mvneclipse 0m 28s the patch passed
          +1 shellcheck 0m 10s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 1s the patch passed
          +1 javadoc 2m 0s the patch passed with JDK v1.8.0_77
          +1 javadoc 2m 54s the patch passed with JDK v1.7.0_95
          -1 unit 7m 36s hadoop-common in the patch failed with JDK v1.8.0_77.
          -1 unit 56m 43s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          +1 unit 9m 10s hadoop-common in the patch passed with JDK v1.7.0_95.
          -1 unit 54m 1s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 26s Patch does not generate ASF License warnings.
          191m 54s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.net.TestDNS
            hadoop.hdfs.server.namenode.ha.TestInitializeSharedEdits
            hadoop.hdfs.server.namenode.TestEditLog
            hadoop.hdfs.server.datanode.TestDataNodeMetrics
          JDK v1.7.0_95 Failed junit tests hadoop.hdfs.server.namenode.TestEditLog
            hadoop.hdfs.TestHFlush
            hadoop.hdfs.server.namenode.TestNameNodeMetadataConsistency
            hadoop.hdfs.server.namenode.ha.TestHASafeMode



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12797676/HADOOP-12563.09.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux e4e6c06edb0e 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / e82f961
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          cc root-jdk1.8.0_77: https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/diff-compile-cc-root-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/9043/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          raviprak Ravi Prakash added a comment -

          Thanks for the patch and all your work, Matt! Thanks also for your reviews and guidance, Steve! I see that this patch has come a long way.

          1. Could you please file and patch a follow-on JIRA for adding documentation? Perhaps here: https://github.com/apache/hadoop/blob/trunk/hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md
          2. Do you think it'd be good to create an Enum for different versions?
                if (version == 0) {
                  readFields(in);
                } else if (version == 1) {
                  readProtos(in);
                }
          3. Do you know if there would be any difference between CredentialsKVProto.newBuilder().setAlias(e.getKey().toString()) and what you have (CredentialsKVProto.newBuilder().setAliasBytes(ByteString.copyFrom(e.getKey().getBytes(), 0, e.getKey().getLength())))? Would one be encoded/decoded differently on varying platforms? Looking into the generated code, I see alias_ would be an Object of type String in one case vs ByteString in the other. I guess the deviation may only matter when the encoding is different from UTF-8. Do you know if we should prefer one way over the other?
          4. Could setKindBytes(ByteString.copyFrom(this.getKind().getBytes(), 0, this.getKind().getLength())) simply be setKindBytes(ByteString.copyFrom(this.getKind().getBytes()))?
          mattpaduano Matthew Paduano added a comment -

          re 1: I added doc to CommandsManual.md and also javadoc for DtFileOperations in the latest patch.

          re 2: The types of TOKEN_STORAGE_VERSION and OLD_TOKEN_STORAGE_VERSION are byte.
          They need to be byte as they are passed directly to stream write() methods.
          They are private fields and there should not be very many of them. I think an enum is overkill.

          I changed this code to use those symbols directly and avoid the bare literals 0 and 1.
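
          For reference, a sketch of the resulting dispatch with those named constants (a sketch, not the patch itself; values follow the snippet above, 0 for the legacy Writable format and 1 for protobuf):

              private static final byte TOKEN_STORAGE_VERSION = 1;      // protobuf format
              private static final byte OLD_TOKEN_STORAGE_VERSION = 0;  // Writable format

              if (version == OLD_TOKEN_STORAGE_VERSION) {
                readFields(in);
              } else if (version == TOKEN_STORAGE_VERSION) {
                readProtos(in);
              }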

          re 3: Java Strings are UTF-16, and there is a change of encoding twice in the first example: a copy
          from one UTF-8 buffer (io.Text) to a UTF-16 buffer (String), and then back to a UTF-8 buffer (ByteString).
          In the other case, the byte[] from the io.Text object is copied directly to the byte[] of the ByteString
          object (which is interned, like Java Strings). So there is just one copy in the copyFrom case, and no
          encoding switch. That is what one should prefer, and it is what other proto code in Hadoop uses.

          re 4: I see examples around the code base using both forms. I think they are the same.
          I changed to the shorter form here.

          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 16s Docker mode activated.
          0 shelldocs 0m 5s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 15s Maven dependency ordering for branch
          +1 mvninstall 6m 51s trunk passed
          +1 compile 6m 6s trunk passed with JDK v1.8.0_77
          +1 compile 6m 56s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 5s trunk passed
          +1 mvnsite 1m 50s trunk passed
          +1 mvneclipse 0m 29s trunk passed
          +1 findbugs 3m 37s trunk passed
          +1 javadoc 2m 0s trunk passed with JDK v1.8.0_77
          +1 javadoc 2m 48s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 14s Maven dependency ordering for patch
          +1 mvninstall 1m 28s the patch passed
          +1 compile 5m 42s the patch passed with JDK v1.8.0_77
          +1 cc 5m 42s the patch passed
          +1 javac 5m 42s the patch passed
          +1 compile 6m 44s the patch passed with JDK v1.7.0_95
          +1 cc 6m 44s the patch passed
          +1 javac 6m 44s the patch passed
          +1 checkstyle 1m 5s root: patch generated 0 new + 6 unchanged - 27 fixed = 6 total (was 33)
          +1 mvnsite 1m 47s the patch passed
          +1 mvneclipse 0m 28s the patch passed
          +1 shellcheck 0m 9s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 0s the patch passed
          +1 javadoc 2m 3s the patch passed with JDK v1.8.0_77
          +1 javadoc 2m 50s the patch passed with JDK v1.7.0_95
          -1 unit 6m 49s hadoop-common in the patch failed with JDK v1.8.0_77.
          -1 unit 75m 59s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          -1 unit 7m 59s hadoop-common in the patch failed with JDK v1.7.0_95.
          -1 unit 79m 33s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 24s Patch does not generate ASF License warnings.
          230m 48s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.util.TestGenericOptionsParser
            hadoop.hdfs.server.datanode.TestDataNodeHotSwapVolumes
            hadoop.hdfs.TestReadStripedFileWithMissingBlocks
            hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
            hadoop.hdfs.TestDFSUpgradeFromImage
            hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
            hadoop.hdfs.server.namenode.TestEditLog
          JDK v1.8.0_77 Timed out junit tests org.apache.hadoop.hdfs.TestWriteReadStripedFile
            org.apache.hadoop.hdfs.TestReadStripedFileWithDecoding
          JDK v1.7.0_95 Failed junit tests hadoop.util.TestGenericOptionsParser
            hadoop.hdfs.TestHFlush
            hadoop.hdfs.TestReadStripedFileWithMissingBlocks
            hadoop.hdfs.server.namenode.snapshot.TestOpenFilesWithSnapshot
            hadoop.fs.contract.hdfs.TestHDFSContractDelete
            hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
            hadoop.hdfs.server.namenode.TestHDFSConcat
            hadoop.hdfs.server.namenode.TestListCorruptFileBlocks
            hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
            hadoop.hdfs.TestDFSStripedOutputStreamWithFailure070
            hadoop.hdfs.server.namenode.TestFavoredNodesEndToEnd
            hadoop.hdfs.server.namenode.TestEditLog
          JDK v1.7.0_95 Timed out junit tests org.apache.hadoop.hdfs.TestWriteReadStripedFile
            org.apache.hadoop.hdfs.TestReadStripedFileWithDecoding



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12798385/HADOOP-12563.10.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 542c5681cae0 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 042a3ae
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/9075/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          mattpaduano Matthew Paduano added a comment -

          One of those test failures (TestGenericOptionsParser) is caused by a change in patch 10.
          I am attaching patch 11 to fix that problem.

          mattpaduano Matthew Paduano added a comment -

          Use the long form of ByteString.copyFrom to protect against
          too-long backing buffers returned by io.Text objects' getBytes().
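
          The hazard in a nutshell (the alias value is illustrative): Text.getBytes() returns the backing array, which may be longer than the logical contents, so the offset/length overload copies exactly the right bytes, with no String round trip:

              Text alias = new Text("renewer");
              // risky if the backing array is over-allocated:
              //   ByteString.copyFrom(alias.getBytes())
              // safe: copy only the logical bytes
              ByteString safe = ByteString.copyFrom(alias.getBytes(), 0, alias.getLength());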

          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 17s Docker mode activated.
          0 shelldocs 0m 3s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 15s Maven dependency ordering for branch
          +1 mvninstall 8m 19s trunk passed
          +1 compile 10m 56s trunk passed with JDK v1.8.0_77
          +1 compile 7m 49s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 12s trunk passed
          +1 mvnsite 2m 7s trunk passed
          +1 mvneclipse 0m 31s trunk passed
          +1 findbugs 3m 51s trunk passed
          +1 javadoc 2m 22s trunk passed with JDK v1.8.0_77
          +1 javadoc 3m 10s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 15s Maven dependency ordering for patch
          +1 mvninstall 1m 40s the patch passed
          +1 compile 8m 46s the patch passed with JDK v1.8.0_77
          +1 cc 8m 46s the patch passed
          +1 javac 8m 46s the patch passed
          +1 compile 7m 56s the patch passed with JDK v1.7.0_95
          +1 cc 7m 56s the patch passed
          +1 javac 7m 56s the patch passed
          -1 checkstyle 1m 5s root: patch generated 1 new + 6 unchanged - 27 fixed = 7 total (was 33)
          +1 mvnsite 1m 59s the patch passed
          +1 mvneclipse 0m 31s the patch passed
          +1 shellcheck 0m 10s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 13s the patch passed
          +1 javadoc 2m 23s the patch passed with JDK v1.8.0_77
          +1 javadoc 3m 14s the patch passed with JDK v1.7.0_95
          -1 unit 10m 4s hadoop-common in the patch failed with JDK v1.8.0_77.
          -1 unit 100m 20s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          +1 unit 10m 57s hadoop-common in the patch passed with JDK v1.7.0_95.
          -1 unit 100m 58s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 32s Patch does not generate ASF License warnings.
          297m 21s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.net.TestClusterTopology
            hadoop.hdfs.server.datanode.TestDataNodeUUID
            hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistPolicy
            hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
            hadoop.hdfs.TestReadStripedFileWithMissingBlocks
            hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
            hadoop.hdfs.security.TestDelegationTokenForProxyUser
            hadoop.hdfs.TestFileAppend
            hadoop.hdfs.server.blockmanagement.TestSequentialBlockGroupId
            hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistReplicaRecovery
            hadoop.hdfs.server.datanode.TestDirectoryScanner
          JDK v1.8.0_77 Timed out junit tests org.apache.hadoop.hdfs.TestWriteReadStripedFile
            org.apache.hadoop.hdfs.TestReadStripedFileWithDecoding
          JDK v1.7.0_95 Failed junit tests hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
            hadoop.hdfs.server.datanode.fsdataset.impl.TestInterDatanodeProtocol
            hadoop.hdfs.TestReadStripedFileWithMissingBlocks
            hadoop.hdfs.TestHFlush
            hadoop.hdfs.server.namenode.snapshot.TestRenameWithSnapshots
            hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
            hadoop.hdfs.shortcircuit.TestShortCircuitLocalRead
            hadoop.hdfs.server.namenode.TestNameNodeAcl
            hadoop.hdfs.server.blockmanagement.TestBlocksWithNotEnoughRacks
            hadoop.hdfs.TestFileAppend
            hadoop.hdfs.server.balancer.TestBalancer
            hadoop.hdfs.server.namenode.TestEditLog
            hadoop.hdfs.server.datanode.fsdataset.impl.TestScrLazyPersistFiles
            hadoop.hdfs.server.namenode.TestEditLogRace
            hadoop.hdfs.server.namenode.snapshot.TestOpenFilesWithSnapshot
            hadoop.hdfs.server.datanode.TestDirectoryScanner
          JDK v1.7.0_95 Timed out junit tests org.apache.hadoop.hdfs.TestWriteReadStripedFile
            org.apache.hadoop.hdfs.TestReadStripedFileWithDecoding



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12798559/HADOOP-12563.11.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 300e761e08fc 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / e0cb426
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/diff-checkstyle-root.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/9084/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          mattpaduano Matthew Paduano added a comment -
          Diff between patch 11 and patch 12:
          
          spelling error:
          
          < +   *  a token instance that is approriate for aliasing, or null if none. */
          ---
          > +   *  a token instance that is appropriate for aliasing, or null if none. */
          
          
          
          trap possibly hard-to-interpret errors when fetching a token without proper config:
          
          > +        if (!fetcher.isTokenRequired()) {
          > +          String message = "DtFetcher for service '" + service +
          > +              " does not require a token.  Check your configuration.  " +
          > +              "Note: security may be disabled or there may be two DtFetcher " +
          > +              "providers for the same service designation.";
          > +          LOG.error(message);
          > +          throw new Exception(message);
          > +        }
          
          
          raviprak Ravi Prakash added a comment -

          Your error message is missing a single quote on the other side of + service +. Patch looks good to me once you fix that. Steve Loughran, any comments? I'll try to commit by end of week if none.

          raviprak Ravi Prakash added a comment -

          Also, perhaps throw new IllegalArgumentException instead of the generic Exception?

          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 17s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 59s Maven dependency ordering for branch
          +1 mvninstall 6m 55s trunk passed
          +1 compile 5m 58s trunk passed with JDK v1.8.0_77
          +1 compile 6m 42s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 5s trunk passed
          +1 mvnsite 1m 52s trunk passed
          +1 mvneclipse 0m 28s trunk passed
          +1 findbugs 3m 38s trunk passed
          +1 javadoc 1m 59s trunk passed with JDK v1.8.0_77
          +1 javadoc 2m 50s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 14s Maven dependency ordering for patch
          +1 mvninstall 1m 30s the patch passed
          +1 compile 5m 48s the patch passed with JDK v1.8.0_77
          +1 cc 5m 48s the patch passed
          +1 javac 5m 48s the patch passed
          +1 compile 6m 47s the patch passed with JDK v1.7.0_95
          +1 cc 6m 47s the patch passed
          +1 javac 6m 47s the patch passed
          +1 checkstyle 1m 7s root: patch generated 0 new + 6 unchanged - 27 fixed = 6 total (was 33)
          +1 mvnsite 1m 49s the patch passed
          +1 mvneclipse 0m 29s the patch passed
          +1 shellcheck 0m 10s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 4m 2s the patch passed
          +1 javadoc 1m 55s the patch passed with JDK v1.8.0_77
          +1 javadoc 2m 51s the patch passed with JDK v1.7.0_95
          +1 unit 8m 28s hadoop-common in the patch passed with JDK v1.8.0_77.
          -1 unit 70m 31s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          +1 unit 8m 39s hadoop-common in the patch passed with JDK v1.7.0_95.
          -1 unit 70m 22s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 29s Patch does not generate ASF License warnings.
          219m 28s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.hdfs.server.namenode.TestStartup
            hadoop.hdfs.server.namenode.snapshot.TestOpenFilesWithSnapshot
            hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
            hadoop.hdfs.server.namenode.TestNamenodeRetryCache
            hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA
          JDK v1.7.0_95 Failed junit tests hadoop.hdfs.server.namenode.TestNamenodeRetryCache
            hadoop.hdfs.server.namenode.ha.TestRetryCacheWithHA



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12799774/HADOOP-12563.12.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux 292bee6dc5d6 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / af9bdbe
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/9129/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          stevel@apache.org Steve Loughran added a comment -

          Been too busy to look. Let's get it in and evolve it in place if needed.

          mattpaduano Matthew Paduano added a comment -
          patch 12-13 diff:
          
          < +              " does not require a token.  Check your configuration.  " +
          ---
          > +              "' does not require a token.  Check your configuration.  " +
           
          < +          throw new Exception(message);
          ---
          > +          throw new IllegalArgumentException(message);
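
          (The second hunk swaps a generic checked Exception for an unchecked IllegalArgumentException, which better matches a bad-configuration error from the caller and spares callers from declaring or catching a checked type.)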
          
          
          
          raviprak Ravi Prakash added a comment -

          Thanks Matt for all your work and for your deep insight Steve! Committing shortly

          raviprak Ravi Prakash added a comment -

          +1. LGTM. Committed to trunk

          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-trunk-Commit #9646 (See https://builds.apache.org/job/Hadoop-trunk-Commit/9646/)
          HADOOP-12563. Updated utility (dtutil) to create/modify token files. (raviprak: rev 4838b735f0d472765f402fe6b1c8b6ce85b9fbf1)

          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/proto/Security.proto
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtUtilShell.java
          • hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/CommandShell.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DelegationTokenFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/WebHdfsDtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/HdfsDtFetcher.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/tools/TestCommandShell.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFileOperations.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtUtilShell.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/SWebHdfsDtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md
          hadoopqa Hadoop QA added a comment -
          -1 overall



          Vote Subsystem Runtime Comment
          0 reexec 0m 10s Docker mode activated.
          0 shelldocs 0m 4s Shelldocs was not available.
          +1 @author 0m 0s The patch does not contain any @author tags.
          +1 test4tests 0m 0s The patch appears to include 4 new or modified test files.
          0 mvndep 0m 13s Maven dependency ordering for branch
          +1 mvninstall 6m 33s trunk passed
          +1 compile 7m 31s trunk passed with JDK v1.8.0_77
          +1 compile 7m 13s trunk passed with JDK v1.7.0_95
          +1 checkstyle 1m 7s trunk passed
          +1 mvnsite 1m 51s trunk passed
          +1 mvneclipse 0m 27s trunk passed
          +1 findbugs 3m 25s trunk passed
          +1 javadoc 1m 55s trunk passed with JDK v1.8.0_77
          +1 javadoc 2m 47s trunk passed with JDK v1.7.0_95
          0 mvndep 0m 13s Maven dependency ordering for patch
          +1 mvninstall 1m 27s the patch passed
          +1 compile 6m 49s the patch passed with JDK v1.8.0_77
          +1 cc 6m 49s the patch passed
          +1 javac 6m 49s the patch passed
          +1 compile 7m 18s the patch passed with JDK v1.7.0_95
          +1 cc 7m 18s the patch passed
          +1 javac 7m 18s the patch passed
          +1 checkstyle 1m 5s root: patch generated 0 new + 7 unchanged - 27 fixed = 7 total (was 34)
          +1 mvnsite 1m 57s the patch passed
          +1 mvneclipse 0m 26s the patch passed
          +1 shellcheck 0m 9s There were no new shellcheck issues.
          +1 whitespace 0m 0s Patch has no whitespace issues.
          +1 findbugs 3m 56s the patch passed
          +1 javadoc 1m 57s the patch passed with JDK v1.8.0_77
          +1 javadoc 2m 50s the patch passed with JDK v1.7.0_95
          -1 unit 7m 0s hadoop-common in the patch failed with JDK v1.8.0_77.
          -1 unit 62m 37s hadoop-hdfs in the patch failed with JDK v1.8.0_77.
          -1 unit 7m 5s hadoop-common in the patch failed with JDK v1.7.0_95.
          -1 unit 55m 56s hadoop-hdfs in the patch failed with JDK v1.7.0_95.
          +1 asflicense 0m 24s Patch does not generate ASF License warnings.
          195m 45s



          Reason Tests
          JDK v1.8.0_77 Failed junit tests hadoop.net.TestDNS
            hadoop.metrics2.impl.TestGangliaMetrics
            hadoop.hdfs.server.datanode.TestDataNodeUUID
            hadoop.hdfs.security.TestDelegationTokenForProxyUser
            hadoop.hdfs.server.namenode.TestEditLog
          JDK v1.7.0_95 Failed junit tests hadoop.metrics2.impl.TestGangliaMetrics
            hadoop.hdfs.server.datanode.fsdataset.impl.TestFsDatasetImpl
            hadoop.hdfs.TestHFlush
            hadoop.hdfs.TestDFSClientRetries
            hadoop.hdfs.server.namenode.TestEditLog
            hadoop.hdfs.server.balancer.TestBalancer
            hadoop.hdfs.TestEncryptionZones



          Subsystem Report/Notes
          Docker Image:yetus/hadoop:fbe3e86
          JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12799992/HADOOP-12563.13.patch
          JIRA Issue HADOOP-12563
          Optional Tests asflicense mvnsite unit shellcheck shelldocs compile javac javadoc mvninstall findbugs checkstyle cc
          uname Linux f4e15e0057b6 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
          Build tool maven
          Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
          git revision trunk / 7da5847
          Default Java 1.7.0_95
          Multi-JDK versions /usr/lib/jvm/java-8-oracle:1.8.0_77 /usr/lib/jvm/java-7-openjdk-amd64:1.7.0_95
          shellcheck v0.4.3
          findbugs v3.0.0
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt
          unit https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          unit test logs https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.8.0_77.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-common-project_hadoop-common-jdk1.7.0_95.txt https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/artifact/patchprocess/patch-unit-hadoop-hdfs-project_hadoop-hdfs-jdk1.7.0_95.txt
          JDK v1.7.0_95 Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/testReport/
          modules C: hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs U: .
          Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/9143/console
          Powered by Apache Yetus 0.2.0 http://yetus.apache.org

          This message was automatically generated.

          brahmareddy Brahma Reddy Battula added a comment -

The following tests fail after this went in. Since Jenkins did not run on the YARN and MAPREDUCE projects, these failures were missed:

          TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState
          TestContainerManagerRecovery.testApplicationRecovery:189->startContainer:511 » IllegalState
          TestContainerManagerRecovery.testContainerCleanupOnShutdown:412->startContainer:511 » IllegalState
          TestContainerManagerRecovery.testContainerResizeRecovery:351->startContainer:511 » IllegalState

          See https://builds.apache.org/job/Hadoop-Yarn-trunk/2051/

          FAILED:  org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery.testApplicationRecovery
          
          Error Message:
          InputStream#read(byte[]) returned invalid result: 0 The InputStream implementation is buggy.
          
          Stack Trace:
          java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0 The InputStream implementation is buggy.
          	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
          	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
          	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
          	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
          	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
          	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
          	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
          	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
          	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
          	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
          	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
          	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
          	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
          	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
          	at org.apache.hadoop.yarn.server.utils.YarnServerSecurityUtils.parseCredentials(YarnServerSecurityUtils.java:131)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.startContainerInternal(ContainerManagerImpl.java:924)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl.startContainers(ContainerManagerImpl.java:815)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery$3.run(TestContainerManagerRecovery.java:514)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery$3.run(TestContainerManagerRecovery.java:511)
          	at java.security.AccessController.doPrivileged(Native Method)
          	at javax.security.auth.Subject.doAs(Subject.java:415)
          	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery.startContainer(TestContainerManagerRecovery.java:511)
          	at org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery.testApplicationRecovery(TestContainerManagerRecovery.java:189)
          
          stevel@apache.org Steve Loughran added a comment -

OK, what to do? Ravi, Matt: can you look at this today? Otherwise it should be rolled back to get Jenkins happy and then resubmitted next week.

          bibinchundatt Bibin A Chundatt added a comment -

For cases when secretKeysMap and tokenMap are empty, the tests are failing. Would using writeDelimitedTo during write and parseDelimitedFrom during read help?
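
          A minimal, self-contained sketch of that suggestion (the class name DelimitedIoSketch is illustrative; it assumes the CredentialsProto message generated from this patch's Security.proto). An empty message serializes to zero bytes, so a bare parseFrom(stream) cannot tell where it ends; the delimited calls add a varint length prefix, so empty and back-to-back messages both round-trip:

          import java.io.ByteArrayInputStream;
          import java.io.ByteArrayOutputStream;
          import java.io.IOException;
          import org.apache.hadoop.security.proto.SecurityProtos.CredentialsProto;

          public class DelimitedIoSketch {
            public static void main(String[] args) throws IOException {
              CredentialsProto empty = CredentialsProto.newBuilder().build();

              ByteArrayOutputStream out = new ByteArrayOutputStream();
              empty.writeDelimitedTo(out);  // varint length (0), then no payload
              empty.writeDelimitedTo(out);  // second message on the same stream

              ByteArrayInputStream in = new ByteArrayInputStream(out.toByteArray());
              CredentialsProto first = CredentialsProto.parseDelimitedFrom(in);
              CredentialsProto second = CredentialsProto.parseDelimitedFrom(in);
              System.out.println(first.getTokensCount());   // 0
              System.out.println(second.getSecretsCount()); // 0
            }
          }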

          mattpaduano Matthew Paduano added a comment -

          I can look at this today/soon.

          mattpaduano Matthew Paduano added a comment -

Thanks for that suggestion about delimited IO.
          That seems to fix the failing tests, and the original tests still pass.

          I will work with Ravi to get this patched up.

          raviprak Ravi Prakash added a comment -

Reverting in the meantime. Matt, please create a new patch. Thanks for the heads-up, Brahma and Bibin.

          mattpaduano Matthew Paduano added a comment -

Please see HADOOP-13054, where I attached a patch to fix this issue.

          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-trunk-Commit #9655 (See https://builds.apache.org/job/Hadoop-trunk-Commit/9655/)
          Revert "HADOOP-12563. Updated utility (dtutil) to create/modify token (raviprak: rev d6402fadedade4289949ba9f70f7a0bfb9bca140)

          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java
          • hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md
          • hadoop-common-project/hadoop-common/src/main/proto/Security.proto
          • hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtUtilShell.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/CommandShell.java
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DelegationTokenFetcher.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtUtilShell.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFileOperations.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/HdfsDtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/SWebHdfsDtFetcher.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/tools/TestCommandShell.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/WebHdfsDtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
          mattpaduano Matthew Paduano added a comment -

Using writeDelimitedTo/parseDelimitedFrom in the proto IO:

          --- a/HADOOP-12563.13.patch
          +++ b/HADOOP-12563.14.patch
          @@ -322,7 +322,7 @@ index e6b8722..662eb3e 100644
           +          setSecret(ByteString.copyFrom(e.getValue()));
           +      storage.addSecrets(kv.build());
           +    }
          -+    storage.build().writeTo((DataOutputStream)out);
          ++    storage.build().writeDelimitedTo((DataOutputStream)out);
           +  }
           +
              /**
          @@ -331,7 +331,7 @@ index e6b8722..662eb3e 100644
           +   * @param in - stream ready to read a serialized proto buffer message
           +   */
           +  public void readProtos(DataInput in) throws IOException {
-+    CredentialsProto storage = CredentialsProto.parseFrom((DataInputStream)in);
           ++    CredentialsProto storage = CredentialsProto.parseDelimitedFrom((DataInputStream)in);
           +    for (CredentialsKVProto kv : storage.getTokensList()) {
           +      addToken(new Text(kv.getAliasBytes().toByteArray()),
           +               (Token<? extends TokenIdentifier>) new Token(kv.getToken()));
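
          (For context: writeTo emits the raw message bytes with no framing, so an empty Credentials object writes nothing at all and parseFrom must read to end-of-stream. writeDelimitedTo prefixes each message with its varint-encoded length, which is what lets a reader recover an empty message and read several messages back-to-back from one stream, as the Writable use case requires.)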
          
          bibinchundatt Bibin A Chundatt added a comment -

          Matthew Paduano

1. Could you please also add test cases for empty maps in the new patch.
          2. The test cases don't seem to run on Windows; could you check that too?
          mattpaduano Matthew Paduano added a comment -

Diff of patches 14 and 15:

          218c218
          < +      readProto(in);
          ---
          > +      readProtos(in);
          333c333
          < +  public void readProto(DataInput in) throws IOException {
          ---
          > +  public void readProtos(DataInput in) throws IOException {
          
          
          
          <  public class TestCredentials {
          ...
          < +  @Test
          < +  public void testBasicReadWriteProtoEmpty()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testBasicReadWriteProtoEmpty";
          < +    Credentials ts = new Credentials();
          < +    writeCredentialsProto(ts, testname);
          < +    Credentials ts2 = readCredentialsProto(testname);
          < +    assertEquals("test empty tokens", 0, ts2.numberOfTokens());
          < +    assertEquals("test empty keys", 0, ts2.numberOfSecretKeys());
          < +  }
          < +
          < +  @Test
          < +  public void testBasicReadWriteProto()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testBasicReadWriteProto";
          < +    Text tok1 = new Text("token1");
          < +    Text tok2 = new Text("token2");
          < +    Text key1 = new Text("key1");
          < +    Credentials ts = generateCredentials(tok1, tok2, key1);
          < +    writeCredentialsProto(ts, testname);
          < +    Credentials ts2 = readCredentialsProto(testname);
          < +    assertCredentials(testname, tok1, key1, ts, ts2);
          < +    assertCredentials(testname, tok2, key1, ts, ts2);
          < +  }
          < +
          < +  @Test
          < +  public void testBasicReadWriteStreamEmpty()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testBasicReadWriteStreamEmpty";
          < +    Credentials ts = new Credentials();
          < +    writeCredentialsStream(ts, testname);
          < +    Credentials ts2 = readCredentialsStream(testname);
          < +    assertEquals("test empty tokens", 0, ts2.numberOfTokens());
          < +    assertEquals("test empty keys", 0, ts2.numberOfSecretKeys());
          < +  }
          < +
          < +  @Test
          < +  public void testBasicReadWriteStream()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testBasicReadWriteStream";
          < +    Text tok1 = new Text("token1");
          < +    Text tok2 = new Text("token2");
          < +    Text key1 = new Text("key1");
          < +    Credentials ts = generateCredentials(tok1, tok2, key1);
          < +    writeCredentialsStream(ts, testname);
          < +    Credentials ts2 = readCredentialsStream(testname);
          < +    assertCredentials(testname, tok1, key1, ts, ts2);
          < +    assertCredentials(testname, tok2, key1, ts, ts2);
          < +  }
          < +
          < +  @Test
          < +  /**
< +   * Verify the suitability of read/writeProto for use with Writable interface.
          < +   * This test uses only empty credentials.
          < +   */
          < +  public void testWritablePropertiesEmpty()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testWritablePropertiesEmpty";
          < +    Credentials ts = new Credentials();
          < +    Credentials ts2 = new Credentials();
          < +    writeCredentialsProtos(ts, ts2, testname);
          < +    List<Credentials> clist = readCredentialsProtos(testname);
          < +    assertEquals("test empty tokens 0", 0, clist.get(0).numberOfTokens());
          < +    assertEquals("test empty keys 0", 0, clist.get(0).numberOfSecretKeys());
          < +    assertEquals("test empty tokens 1", 0, clist.get(1).numberOfTokens());
          < +    assertEquals("test empty keys 1", 0, clist.get(1).numberOfSecretKeys());
          < +  }
          < +
          < +  @Test
          < +  /**
< +   * Verify the suitability of read/writeProto for use with Writable interface.
          < +   */
          < +  public void testWritableProperties()
          < +      throws IOException, NoSuchAlgorithmException {
          < +    String testname ="testWritableProperties";
          < +    Text tok1 = new Text("token1");
          < +    Text tok2 = new Text("token2");
          < +    Text key1 = new Text("key1");
          < +    Credentials ts = generateCredentials(tok1, tok2, key1);
          < +    Text tok3 = new Text("token3");
          < +    Text key2 = new Text("key2");
          < +    Credentials ts2 = generateCredentials(tok1, tok3, key2);
          < +    writeCredentialsProtos(ts, ts2, testname);
          < +    List<Credentials> clist = readCredentialsProtos(testname);
          < +    assertCredentials(testname, tok1, key1, ts, clist.get(0));
          < +    assertCredentials(testname, tok2, key1, ts, clist.get(0));
          < +    assertCredentials(testname, tok1, key2, ts2, clist.get(1));
          < +    assertCredentials(testname, tok3, key2, ts2, clist.get(1));
          < +  }
          < +
          < +  private Credentials generateCredentials(Text t1, Text t2, Text t3)
          < +      throws NoSuchAlgorithmException {
          < +    Text kind = new Text("TESTTOK");
< +    byte[] id1 = {0x69, 0x64, 0x65, 0x6e, 0x74, 0x69, 0x66, 0x69, 0x65, 0x72};
< +    byte[] pass1 = {0x70, 0x61, 0x73, 0x73, 0x77, 0x6f, 0x72, 0x64};
< +    byte[] id2 = {0x68, 0x63, 0x64, 0x6d, 0x73, 0x68, 0x65, 0x68, 0x64, 0x71};
< +    byte[] pass2 = {0x6f, 0x60, 0x72, 0x72, 0x76, 0x6e, 0x71, 0x63};
          < +    Credentials ts = new Credentials();
          < +    generateToken(ts, id1, pass1, kind, t1);
          < +    generateToken(ts, id2, pass2, kind, t2);
          < +    generateKey(ts, t3);
          < +    return ts;
          < +  }
          < +
          < +  private void assertCredentials(String tag, Text alias, Text keykey,
          < +                                 Credentials a, Credentials b) {
          < +    assertEquals(tag + ": test token count", a.numberOfTokens(),
          < +                                             b.numberOfTokens());
          < +    assertEquals(tag + ": test service", a.getToken(alias).getService(),
          < +                                         b.getToken(alias).getService());
          < +    assertEquals(tag + ": test kind", a.getToken(alias).getKind(),
          < +                                      b.getToken(alias).getKind());
          < +    assertTrue(tag + ": test password",
          < +        Arrays.equals(a.getToken(alias).getPassword(),
          < +                      b.getToken(alias).getPassword()));
          < +    assertTrue(tag + ": test identifier",
          < +        Arrays.equals(a.getToken(alias).getIdentifier(),
          < +                      b.getToken(alias).getIdentifier()));
          < +    assertEquals(tag + ": test number of keys", a.numberOfSecretKeys(),
          < +                                                b.numberOfSecretKeys());
< +    assertTrue(tag + ":test key values", Arrays.equals(a.getSecretKey(keykey),
< +                                                       b.getSecretKey(keykey)));
          < +  }
          < +
          < +  private void writeCredentialsStream(Credentials creds, String filename)
          < +      throws IOException, FileNotFoundException {
          < +    DataOutputStream dos = new DataOutputStream(
          < +        new FileOutputStream(new File(tmpDir, filename)));
          < +    creds.writeTokenStorageToStream(dos);
          < +  }
          < +
          < +  private Credentials readCredentialsStream(String filename)
          < +      throws IOException, FileNotFoundException {
          < +    Credentials creds = new Credentials();
          < +    DataInputStream dis = new DataInputStream(
          < +        new FileInputStream(new File(tmpDir, filename)));
          < +    creds.readTokenStorageStream(dis);
          < +    return creds;
          < +  }
          < +
          < +  private void writeCredentialsProto(Credentials creds, String filename)
          < +      throws IOException, FileNotFoundException {
          < +    DataOutputStream dos = new DataOutputStream(
          < +        new FileOutputStream(new File(tmpDir, filename)));
          < +    creds.writeProto(dos);
          < +  }
          < +
          < +  private Credentials readCredentialsProto(String filename)
          < +      throws IOException, FileNotFoundException {
          < +    Credentials creds = new Credentials();
          < +    DataInputStream dis = new DataInputStream(
          < +        new FileInputStream(new File(tmpDir, filename)));
          < +    creds.readProto(dis);
          < +    return creds;
          < +  }
          < +
          < +  private void writeCredentialsProtos(Credentials c1, Credentials c2,
          < +      String filename) throws IOException, FileNotFoundException {
          < +    DataOutputStream dos = new DataOutputStream(
          < +        new FileOutputStream(new File(tmpDir, filename)));
          < +    c1.writeProto(dos);
          < +    c2.writeProto(dos);
          < +  }
          < +
          < +  private List<Credentials> readCredentialsProtos(String filename)
          < +      throws IOException, FileNotFoundException {
          < +    Credentials c1 = new Credentials();
          < +    Credentials c2 = new Credentials();
          < +    DataInputStream dis = new DataInputStream(
          < +        new FileInputStream(new File(tmpDir, filename)));
          < +    c1.readProto(dis);
          < +    c2.readProto(dis);
          < +    List<Credentials> r = new ArrayList<Credentials>(2);
          < +    r.add(0, c1);
          < +    r.add(1, c2);
          < +    return r;
          < +  }
          < +
          < +  private <T extends TokenIdentifier> void generateToken(
< +      Credentials creds, byte[] ident, byte[] pass, Text kind, Text service) {
          < +    Token<T> token = new Token(ident, pass, kind, service);
          < +    creds.addToken(service, token);
          < +  }
          < +
          < +  private void generateKey(Credentials creds, Text alias)
          < +      throws NoSuchAlgorithmException {
          < +    final KeyGenerator kg = KeyGenerator.getInstance(DEFAULT_HMAC_ALGORITHM);
          < +    Key key = kg.generateKey();
          < +    creds.addSecretKey(alias, key.getEncoded());
          < +  }
          < +
          
          
          mattpaduano Matthew Paduano added a comment -

I am sorry. I cannot help with Windows.

          mattpaduano Matthew Paduano added a comment -

Diff of patches 15 and 16:

          2165c2165
          < +    Mockito.verify(spyCreds).readProtos(in);
          ---
          > +    Mockito.verify(spyCreds).readProto(in);
          

Testing:

          mvn test -Dtest=TestCredentials,TestRMContainerAllocator,TestContainerManagerRecovery,TestDtUtilShell,TestCommandShell
          -------------------------------------------------------------------------------
          Test set: org.apache.hadoop.security.TestCredentials
          -------------------------------------------------------------------------------
          Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.961 sec - in org.apache.hadoop.security.TestCredentials
          -------------------------------------------------------------------------------
          Test set: org.apache.hadoop.security.token.TestDtUtilShell
          -------------------------------------------------------------------------------
          Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.778 sec - in org.apache.hadoop.security.token.TestDtUtilShell
          -------------------------------------------------------------------------------
          Test set: org.apache.hadoop.tools.TestCommandShell
          -------------------------------------------------------------------------------
          Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.304 sec - in org.apache.hadoop.tools.TestCommandShell
          -------------------------------------------------------------------------------
          Test set: org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery
          -------------------------------------------------------------------------------
          Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.827 sec - in org.apache.hadoop.yarn.server.nodemanager.containermanager.TestContainerManagerRecovery
          -------------------------------------------------------------------------------
          Test set: org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator
          -------------------------------------------------------------------------------
          Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 133.303 sec - in org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator
          
          raviprak Ravi Prakash added a comment -

          Thanks for the updated patch Matt! Checking in momentarily.

          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-trunk-Commit #9698 (See https://builds.apache.org/job/Hadoop-trunk-Commit/9698/)
          HADOOP-12563. Updated utility (dtutil) to create/modify token files. (raviprak: rev 2753185a010e70f8d9539f42151c79177781122d)

          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/Credentials.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/HdfsDtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFileOperations.java
          • hadoop-common-project/hadoop-common/src/test/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtFetcher.java
          • hadoop-hdfs-project/hadoop-hdfs/src/main/resources/META-INF/services/org.apache.hadoop.security.token.DtFetcher
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/WebHdfsDtFetcher.java
          • hadoop-common-project/hadoop-common/src/site/markdown/CommandsManual.md
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/SWebHdfsDtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/Token.java
          • hadoop-common-project/hadoop-common/src/main/bin/hadoop
          • hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/DelegationTokenFetcher.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/tools/TestCommandShell.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtFetcher.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/token/DtUtilShell.java
          • hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/tools/CommandShell.java
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/token/TestDtUtilShell.java
          • hadoop-common-project/hadoop-common/src/main/proto/Security.proto
          • hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/security/TestCredentials.java
          gtCarrera9 Li Lu added a comment -

          I noticed the following errors when running Tez on our trunk code:

          2016-05-09 16:12:18,733 [ERROR] [main] |app.DAGAppMaster|: Error starting DAGAppMaster
          java.io.IOException: Exception reading /tmp/hadoop-llu/nm-local-dir/usercache/llu/appcache/application_1462833671675_0001/container_e03_1462833671675_0001_01_000001/container_tokens
                  at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:197)
                  at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
                  at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:748)
                  at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:621)
                  at org.apache.tez.dag.app.DAGAppMaster.main(DAGAppMaster.java:2345)
          Caused by: java.io.IOException: Unknown version 1 in token storage.
                  at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:215)
                  at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:194)
                  ... 4 more
          

          Both Tez and Hadoop are from the latest master/trunk.

          Looks like we introduced some incompatible changes in this JIRA? I marked it as an incompatible change for now.

          mattpaduano Matthew Paduano added a comment -

          That looks like a bona fide IO error to me...

                // this snippet is from Credentials.readTokenStorageFile(), where the trace starts
                in = new DataInputStream(new BufferedInputStream(
                    new FileInputStream(filename)));
          
          mattpaduano Matthew Paduano added a comment -

          Also, FYI, that function was not changed by this patch (only the whitespace in the declaration was edited), and from the stack trace it looks like that is the entry point of the code too.

          gtCarrera9 Li Lu added a comment -

          That looks like a bona fide IO error to me...

          Sorry, I'm confused; could you please elaborate on this?

          The exception was generated from readTokenStorageStream, where we check the version number of the token storage. I'm not sure why 1 is an unknown version in the version check...

          mattpaduano Matthew Paduano added a comment -

          From the line numbers in the lower part of the stack trace, it looks like you are using mixed versions of the Credentials code to manipulate the token files. You are making the file with a new version (version 1) and trying to read it with unpatched code that doesn't know the new version. I think lines 215 and 194 in your lower stack trace correspond to lines 220 and 199 in trunk (215 and 194 is where they were before this patch was applied).

          I had always considered it a requirement:

          future code can read legacy tokens, but legacy code cannot read future tokens.

          Suggestions welcome.
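
          To illustrate that requirement, here is a minimal sketch of the one-way compatibility idea in Java. The class, method, and constant names are hypothetical, not the actual Credentials internals; the point is only that a post-patch reader dispatches on the version byte, while a pre-patch reader has no branch for version 1:

              import java.io.DataInputStream;
              import java.io.IOException;

              public class TokenFileReader {
                // Hypothetical version markers; the real constants live in Credentials.
                private static final byte VERSION_JAVA = 0; // legacy Writable-based layout
                private static final byte VERSION_PB = 1;   // protobuf-based layout

                public void read(DataInputStream in) throws IOException {
                  byte version = in.readByte();
                  switch (version) {
                    case VERSION_JAVA:
                      readLegacyJavaFormat(in); // future code still reads legacy files
                      break;
                    case VERSION_PB:
                      readProtobufFormat(in);   // legacy code has no such branch, so it
                      break;                    // fails with "Unknown version 1"
                    default:
                      throw new IOException("Unknown version " + version + " in token storage.");
                  }
                }

                private void readLegacyJavaFormat(DataInputStream in) throws IOException {
                  // parse the legacy layout (details omitted in this sketch)
                }

                private void readProtobufFormat(DataInputStream in) throws IOException {
                  // parse the protobuf layout (details omitted in this sketch)
                }
              }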

          mattpaduano Matthew Paduano added a comment -

          To be clear: lines 220 and 199 in trunk correspond to lines 215 and 194 in pre-this-patch code.

          That is the cause of the error: code written before version 1 was added is being asked to read a token file created with version 1.

          Note that to fetch a token file and use the legacy format (for just this use case), one might use:

          DtFileOperations.getTokenFile(File tokenFile, String fileFormat,
              Text alias, Text service, String url, String renewer, Configuration conf)

          where fileFormat = DtFileOperations.FORMAT_JAVA

          or use the command line:

          hadoop dtutil get hdfs://hostname:port -format java filename
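
          For instance, a minimal sketch of the API route, assuming the signature above; the URL and file name are placeholders, and passing null for alias, service, and renewer is an assumption of this sketch rather than documented behavior:

              import java.io.File;

              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.io.Text;
              import org.apache.hadoop.security.token.DtFileOperations;

              public class FetchLegacyToken {
                public static void main(String[] args) throws Exception {
                  Configuration conf = new Configuration();
                  Text alias = null;     // assumed optional in this sketch
                  Text service = null;   // assumed optional in this sketch
                  String renewer = null; // assumed optional in this sketch

                  // Fetch a delegation token from the (placeholder) service URL and
                  // write it in the legacy Java serialization format, so that
                  // pre-HADOOP-12563 readers can still parse the file.
                  DtFileOperations.getTokenFile(new File("my.token"),
                      DtFileOperations.FORMAT_JAVA, alias, service,
                      "hdfs://namenode.example.com:8020", renewer, conf);
                }
              }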

          gtCarrera9 Li Lu added a comment -

          future code can read legacy tokens, but legacy code cannot read future tokens.

          Sure, this is fine for trunk. At the same time, it means older clients (YARN 2 clients) may have problems talking to YARN 3 servers; that's why I'm adding an incompatible flag here.

          mattpaduano Matthew Paduano added a comment -

          Also, if we want to change the code in Credentials... I would consider making the default version in the modern code FORMAT_JAVA instead of FORMAT_PB. But I was under the impression there was some flaw with the legacy format. That is for others to discuss...

          gtCarrera9 Li Lu added a comment -

          Thanks Matthew Paduano! I'm wondering if we want to have a configuration for the default token file version, so that we can still run apps like Tez while they gradually adopt the new format?
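
          As a rough illustration of that proposal (the property name below is purely hypothetical; the real change is tracked in HADOOP-13123), the write path could consult a configuration key before choosing a serialization:

              import java.io.DataOutputStream;
              import java.io.IOException;

              import org.apache.hadoop.conf.Configuration;

              public class ConfigurableTokenWriter {
                // Hypothetical property and values, for illustration only.
                static final String FORMAT_KEY = "hadoop.security.credentials.default.format";
                static final String FORMAT_JAVA = "java";   // legacy layout, version 0
                static final String FORMAT_PB = "protobuf"; // new layout, version 1

                public void writeTokenStorage(DataOutputStream out, Configuration conf)
                    throws IOException {
                  String format = conf.get(FORMAT_KEY, FORMAT_PB);
                  if (FORMAT_JAVA.equals(format)) {
                    writeLegacyJavaFormat(out); // older readers stay compatible
                  } else {
                    writeProtobufFormat(out);   // the default on trunk
                  }
                }

                private void writeLegacyJavaFormat(DataOutputStream out) throws IOException {
                  // emit version byte 0 and the Writable-based layout (omitted)
                }

                private void writeProtobufFormat(DataOutputStream out) throws IOException {
                  // emit version byte 1 and the protobuf layout (omitted)
                }
              }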

          mattpaduano Matthew Paduano added a comment -

          If the client fetches a token and is doing the file writing itself with the old Credentials code, it ought to write a version it understands. If it is asking the server to do it, it should ask it to use FORMAT_JAVA. I am not sure how it is asking... obviously this is a new change to that branch, but that might be a good patch for older branches (if the plan is not to cherry-pick dtutil altogether, which isn't hard... just need to edit the bin/hadoop bash stuff).

          mattpaduano Matthew Paduano added a comment -

          Ah, that sounds like a nice proposal. I like that; I can do that.
          HADOOP-13123

          gtCarrera9 Li Lu added a comment -

          Thanks Matthew Paduano!
          aw Allen Wittenauer added a comment -

          I've commented on HADOOP-13123, but this:

          we can still run apps like Tez while they gradually adopt the new format?

          is basically the wrong approach. Tez should not be reading the file outside of the provided Hadoop classes.
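
          For reference, going through Credentials rather than parsing the bytes directly looks roughly like the sketch below; the file path is a placeholder, and the File-based readTokenStorageFile overload is assumed from the stack trace above:

              import java.io.File;

              import org.apache.hadoop.conf.Configuration;
              import org.apache.hadoop.security.Credentials;
              import org.apache.hadoop.security.token.Token;

              public class ReadContainerTokens {
                public static void main(String[] args) throws Exception {
                  Configuration conf = new Configuration();
                  // Placeholder path; inside a container this would be the
                  // localized container_tokens file.
                  File tokenFile = new File("container_tokens");

                  // Credentials performs the version check and picks the right
                  // parser, insulating callers from format changes like this one.
                  Credentials creds = Credentials.readTokenStorageFile(tokenFile, conf);
                  for (Token<?> t : creds.getAllTokens()) {
                    System.out.println(t.getService() + " : " + t.getKind());
                  }
                }
              }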

          hitesh Hitesh Shah added a comment -

          Allen Wittenauer Did you take a look at the stack trace in question?

          Caused by: java.io.IOException: Unknown version 1 in token storage.
                  at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:215)
                  at org.apache.hadoop.security.Credentials.readTokenStorageFile(Credentials.java:194)
          

          Could you clarify what you mean by "Tez should not be reading the file outside of the provided Hadoop classes"?

          mattpaduano Matthew Paduano added a comment -

          I think he is referring to using two incompatible versions of the same class (Credentials). One is from trunk, one is from another branch (although simply using an older jar file from even the same branch would cause the same error).

          I agree with Allen Wittenauer in this case. In general, one cannot load two arbitrary versions of a class and expect data to remain compatible between them. This is an issue of mismatched jar files; it is just lucky that it can be alleviated via a simple config prop addition.

          aw Allen Wittenauer added a comment -

          Yup. If this is true:

          Both Tez and Hadoop are from the latest master/trunk.

          ... then that means Tez is including older jars in its distribution, which it should definitely not be doing for the credential code, given:

          @InterfaceAudience.LimitedPrivate({"HDFS", "MapReduce"})

          So yes, Tez is broken here, not Hadoop.

          stevel@apache.org Steve Loughran added a comment -

          Allen: if something says "MapReduce" then it really means "any YARN app". Too much of Hadoop code (even UGI) is tagged as limited private when it has escaped the nest.

          If I'd known this would have broken existing code, I'd have pushed for a better solution. Every incompatible feature that goes into 3.x moves 3.x further away from being something people will want to adopt.

          aw Allen Wittenauer added a comment -

          Too much of Hadoop code (even UGI) is tagged as limited private when it has escaped the nest.

          Then someone should actually do the paperwork to change it. Until then, I'm sticking with what's in the source code. This code is limited to HDFS and MR.

          Every incompatible feature that goes into 3.x moves 3.x further away from being something people will want to adopt.

          ... where people = distribution vendors. The vast majority of folks using the Hadoop stack aren't building it themselves anymore and won't really care if the token file format changed, since vendors are on the hook to make sure all of their bits work together.

          raviprak Ravi Prakash added a comment -

          FWIW, I'm +1 on removing it from branch-2 and labeling it an incompatible fix. Sorry about the breakage, folks...

          hitesh Hitesh Shah added a comment -

          Ravi Prakash, mind updating the fix versions to denote which version of 2.x this commit went into?

          raviprak Ravi Prakash added a comment -

          Haah! I stand corrected. It was never committed to branch-2; the Fix Version is correct.

          stevel@apache.org Steve Loughran added a comment -

          AW: I do the paperwork to change these things when I see them: HADOOP-12913, HADOOP-11822... and as someone who builds downstream code (Spark, Slider) against branch-2 and branch-3, I'm often the first person complaining that things have broken.

          I don't know what other things HBase, Hive, Flink, etc. have picked up; I don't know what expectations they have on the behaviour of bits of the code. All I know is that changes which break compatibility across versions break my own code. Yes, I try to address these problems before our customers get to see them, but (a) it's a pain, (b) if it changes binary signatures then it's a problem for any app designed to build across versions, and (c) semantic changes are the most subtle of all: these are the ones which lurk until production. And you don't want me to add extra work for the ops teams, do you?


            People

            • Assignee: mattpaduano Matthew Paduano
            • Reporter: aw Allen Wittenauer
            • Votes: 0
            • Watchers: 21
