Hadoop Common / HADOOP-14146

KerberosAuthenticationHandler should authenticate with SPN in AP-REQ

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.5.0
    • Fix Version/s: 2.9.0, 3.0.0-alpha4, 2.8.2
    • Component/s: security
    • Labels:
      None
    • Target Version/s:
    • Hadoop Flags:
      Reviewed

      Description

      Many attempts (HADOOP-10158, HADOOP-11628, HADOOP-13565) have tried to add support for multiple SPN hosts and/or realms to SPNEGO authentication. The basic problem is that the server tries to guess and/or brute-force the SPN the client used. The server should just decode the SPN from the AP-REQ.
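      As a rough illustration of that flow (a sketch only — the class and method names here are made up for the example; the only real piece is the KerberosUtil.getTokenServerName(byte[]) helper the attached patches add), the server can read the SPN straight out of the client's Negotiate token instead of guessing it:

      import javax.servlet.http.HttpServletRequest;

      import org.apache.commons.codec.binary.Base64;
      import org.apache.hadoop.security.authentication.util.KerberosUtil;

      public class SpnFromApReqSketch {
        // Illustrative only: base64-decode the SPNEGO token from the
        // Authorization header and read the SPN the client actually asked for.
        static String serverPrincipalOf(HttpServletRequest request) {
          String authz = request.getHeader("Authorization");   // "Negotiate <base64>"
          if (authz == null || !authz.startsWith("Negotiate ")) {
            throw new IllegalArgumentException("Not a SPNEGO request");
          }
          byte[] clientToken =
              Base64.decodeBase64(authz.substring("Negotiate ".length()));
          // Helper added by this patch: walks the DER encoding of the AP-REQ
          // and returns e.g. "HTTP/host.example.com@EXAMPLE.COM".
          return KerberosUtil.getTokenServerName(clientToken);
        }
      }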

      1. HADOOP-14146.1.patch
        28 kB
        Daryn Sharp
      2. HADOOP-14146.2.patch
        27 kB
        Daryn Sharp
      3. HADOOP-14146.3.patch
        27 kB
        Daryn Sharp
      4. HADOOP-14146.addendum.patch
        0.8 kB
        Daryn Sharp
      5. HADOOP-14146.branch-2.test-import.patch
        1 kB
        Daryn Sharp
      6. HADOOP-14146.patch
        28 kB
        Daryn Sharp

        Activity

        daryn Daryn Sharp added a comment -

        In earlier jiras, I attempted to support multiple hostnames and cnames via the HTTP Host header. Then I had to always canonicalize to support cnames. The community later added support for multiple realms.

        While testing EZ, it was discovered that either the JDK or AuthenticatedURL isn't canonicalizing cnames as expected. Rather than add more hackery, let's just extract the SPN from the AP-REQ. This patch uses a minimal DER parser to do that.

        Removed a bunch of unnecessary code for managing login contexts and SPN mappings. JDK 7 added a KeyTab class whose instances can be bound to a specific SPN or left unbound for any valid SPN in the keytab. Adding this object to the Subject is sufficient. Combined with SPN extraction, the server can authenticate against any SPN in the keytab (if it's unbound).
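
        A minimal sketch of that Subject setup (illustrative; it mirrors the KeyTab.getInstance(keytabFile) / getPrivateCredentials().add(...) lines quoted later in this thread, not the full patch — the class and method names are assumptions):

        import java.io.File;
        import java.security.PrivilegedExceptionAction;
        import javax.security.auth.Subject;
        import javax.security.auth.kerberos.KeyTab;
        import org.ietf.jgss.GSSContext;
        import org.ietf.jgss.GSSCredential;
        import org.ietf.jgss.GSSManager;

        public class KeytabAcceptorSketch {
          static GSSContext acceptorContext(File keytabFile) throws Exception {
            // Instead of running a LoginContext per SPN, place a KeyTab
            // credential in the acceptor Subject; GSS looks up whichever
            // key the incoming AP-REQ needs.
            final Subject serverSubject = new Subject();
            serverSubject.getPrivateCredentials().add(KeyTab.getInstance(keytabFile));

            return Subject.doAs(serverSubject,
                new PrivilegedExceptionAction<GSSContext>() {
                  @Override
                  public GSSContext run() throws Exception {
                    // Null acceptor credential: use whatever the Subject provides.
                    return GSSManager.getInstance().createContext((GSSCredential) null);
                  }
                });
          }
        }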

        hadoopqa Hadoop QA added a comment -
        -1 overall



        Vote Subsystem Runtime Comment
        0 reexec 0m 14s Docker mode activated.
        +1 @author 0m 0s The patch does not contain any @author tags.
        +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
        +1 mvninstall 13m 0s trunk passed
        -1 compile 10m 31s root in trunk failed.
        +1 checkstyle 0m 17s trunk passed
        +1 mvnsite 0m 23s trunk passed
        +1 mvneclipse 0m 17s trunk passed
        +1 findbugs 0m 28s trunk passed
        +1 javadoc 0m 16s trunk passed
        +1 mvninstall 0m 15s the patch passed
        -1 compile 8m 14s root in the patch failed.
        -1 javac 8m 14s root in the patch failed.
        -0 checkstyle 0m 17s hadoop-common-project/hadoop-auth: The patch generated 3 new + 40 unchanged - 1 fixed = 43 total (was 41)
        +1 mvnsite 0m 22s the patch passed
        +1 mvneclipse 0m 18s the patch passed
        +1 whitespace 0m 0s The patch has no whitespace issues.
        -1 findbugs 0m 38s hadoop-common-project/hadoop-auth generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)
        +1 javadoc 0m 17s the patch passed
        +1 unit 3m 38s hadoop-auth in the patch passed.
        +1 asflicense 0m 29s The patch does not generate ASF License warnings.
        41m 36s



        Reason Tests
        FindBugs module:hadoop-common-project/hadoop-auth
          org.apache.hadoop.security.authentication.util.KerberosUtil$DER defines equals and uses Object.hashCode() At KerberosUtil.java:Object.hashCode() At KerberosUtil.java:[line 426]



        Subsystem Report/Notes
        Docker Image:yetus/hadoop:a9ad5d6
        JIRA Issue HADOOP-14146
        JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12855990/HADOOP-14146.patch
        Optional Tests asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle
        uname Linux 0ff7a955eb8d 3.13.0-103-generic #150-Ubuntu SMP Thu Nov 24 10:34:17 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
        Build tool maven
        Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
        git revision trunk / 8db7a8c
        Default Java 1.8.0_121
        compile https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/artifact/patchprocess/branch-compile-root.txt
        findbugs v3.0.0
        compile https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/artifact/patchprocess/patch-compile-root.txt
        javac https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/artifact/patchprocess/patch-compile-root.txt
        checkstyle https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/artifact/patchprocess/diff-checkstyle-hadoop-common-project_hadoop-auth.txt
        findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/artifact/patchprocess/new-findbugs-hadoop-common-project_hadoop-auth.html
        Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/testReport/
        modules C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth
        Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/11762/console
        Powered by Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org

        This message was automatically generated.

        daryn Daryn Sharp added a comment -

        Fixed the findbugs and style warnings. The other failures were caused by yarn-ui's index.js, not this patch.
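
        (For context, the flagged FindBugs issue is the usual "defines equals but not hashCode" pairing. A hypothetical sketch of that kind of fix — the field compared here is an assumption for illustration, not the actual DER implementation:)

        public class DerHashCodeSketch {
          // Stand-in for the patch's DER type: whatever fields equals()
          // compares, hashCode() must be computed from the same fields.
          private final int tag;

          DerHashCodeSketch(int tag) {
            this.tag = tag;
          }

          @Override
          public boolean equals(Object obj) {
            return obj instanceof DerHashCodeSketch
                && tag == ((DerHashCodeSketch) obj).tag;
          }

          @Override
          public int hashCode() {
            return tag;  // equal objects now hash equally, which silences the warning
          }
        }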

        hadoopqa Hadoop QA added a comment -
        +1 overall



        Vote Subsystem Runtime Comment
        0 reexec 0m 16s Docker mode activated.
        +1 @author 0m 0s The patch does not contain any @author tags.
        +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
        +1 mvninstall 12m 30s trunk passed
        +1 compile 10m 18s trunk passed
        +1 checkstyle 0m 24s trunk passed
        +1 mvnsite 0m 28s trunk passed
        +1 mvneclipse 0m 24s trunk passed
        +1 findbugs 0m 33s trunk passed
        +1 javadoc 0m 24s trunk passed
        +1 mvninstall 0m 15s the patch passed
        +1 compile 9m 55s the patch passed
        +1 javac 9m 55s the patch passed
        +1 checkstyle 0m 25s hadoop-common-project/hadoop-auth: The patch generated 0 new + 40 unchanged - 1 fixed = 40 total (was 41)
        +1 mvnsite 0m 29s the patch passed
        +1 mvneclipse 0m 24s the patch passed
        +1 whitespace 0m 0s The patch has no whitespace issues.
        +1 findbugs 0m 41s the patch passed
        +1 javadoc 0m 24s the patch passed
        +1 unit 3m 37s hadoop-auth in the patch passed.
        +1 asflicense 0m 42s The patch does not generate ASF License warnings.
        44m 22s



        Subsystem Report/Notes
        Docker Image:yetus/hadoop:a9ad5d6
        JIRA Issue HADOOP-14146
        JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12856606/HADOOP-14146.1.patch
        Optional Tests asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle
        uname Linux d1d0d5375cb1 3.13.0-106-generic #153-Ubuntu SMP Tue Dec 6 15:44:32 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
        Build tool maven
        Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
        git revision trunk / f597f4c
        Default Java 1.8.0_121
        findbugs v3.0.0
        Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/11769/testReport/
        modules C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth
        Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/11769/console
        Powered by Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org

        This message was automatically generated.

        drankye Kai Zheng added a comment -

        Hi Daryn Sharp,

        The way you thought of this approach and implemented it is pretty smart and impressive. It looks overall good to me.

        One question: maybe we could avoid maintaining the DER parser code below by leveraging the ASN.1 and Kerberos library in Apache Kerby? Kerby should be good at such stuff, and the current MiniKDC already uses it.

        +  /**
        +   * Extract the TGS server principal from the given gssapi kerberos or spnego
        +   * wrapped token.
        +   * @param rawToken bytes of the gss token
        +   * @return String of server principal
        +   * @throws IllegalArgumentException if token is undecodable
        +   */
        +  public static String getTokenServerName(byte[] rawToken) {
        +    // subsequent comments include only relevant portions of the kerberos
        +    // DER encoding that will be extracted.
        +    DER token = new DER(rawToken);//.get(0x60);
        +    // InitialContextToken ::= [APPLICATION 0] IMPLICIT SEQUENCE {
        +    //     mech   OID
        +    //     mech-token  (NegotiationToken or InnerContextToken)
        +    // }
        +    DER oid = token.next();
        +    if (oid.equals(DER.SPNEGO_MECH_OID)) {
        +      // NegotiationToken ::= CHOICE {
        +      //     neg-token-init[0] NegTokenInit
        +      // }
        +      // NegTokenInit ::= SEQUENCE {
        +      //     mech-token[2]     InitialContextToken
        +      // }
        +      token = token.next().get(0xa0, 0x30, 0xa2, 0x04).next();
        +      oid = token.next();
        +    }
        +    if (!oid.equals(DER.KRB5_MECH_OID)) {
        +      throw new IllegalArgumentException("Malformed gss token");
        +    }
        +    // InnerContextToken ::= {
        +    //     token-id[1]
        +    //     AP-REQ
        +    // }
        +    if (token.next().getTag() != 1) {
        +      throw new IllegalArgumentException("Not an AP-REQ token");
        +    }
        +    // AP-REQ ::= [APPLICATION 14] SEQUENCE {
        +    //     ticket[3]      Ticket
        +    // }
        +    DER ticket = token.next().get(0x6e, 0x30, 0xa3, 0x61, 0x30);
        +    // Ticket ::= [APPLICATION 1] SEQUENCE {
        +    //     realm[1]       String
        +    //     sname[2]       PrincipalName
        +    // }
        +    // PrincipalName ::= SEQUENCE {
        +    //     name-string[1] SEQUENCE OF String
        +    // }
        +    String realm = ticket.get(0xa1, 0x1b).getAsString();
        +    DER names = ticket.get(0xa2, 0x30, 0xa1, 0x30);
        +    StringBuilder sb = new StringBuilder();
        +    while (names.hasNext()) {
        +      if (sb.length() > 0) {
        +        sb.append('/');
        +      }
        +      sb.append(names.next().getAsString());
        +    }
        +    return sb.append('@').append(realm).toString();
        +  }
        

        A comparable sample in Kerby's code base is this test, though it parses an AS-REQ (which is similar to an AP-REQ).

        public class AsReqCodecTest {
        
            @Test
            public void test() throws IOException, ParseException {
                byte[] bytes = CodecTestUtil.readBinaryFile("/asreq.token");
                Asn1.decodeAndDump(bytes);
                ByteBuffer asReqToken = ByteBuffer.wrap(bytes);
        
                AsReq asReq = new AsReq();
                asReq.decode(asReqToken);
                Asn1.dump(asReq);
        
                assertThat(asReq.getPvno()).isEqualTo(5);
                assertThat(asReq.getMsgType()).isEqualTo(KrbMessageType.AS_REQ);
        
                PaData paData = asReq.getPaData();
                PaDataEntry encTimestampEntry = paData.findEntry(PaDataType.ENC_TIMESTAMP);
                assertThat(encTimestampEntry.getPaDataType()).isEqualTo(PaDataType.ENC_TIMESTAMP);
                assertThat(encTimestampEntry.getPaDataValue()).isEqualTo(Arrays.copyOfRange(bytes, 33, 96));
                PaDataEntry pacRequestEntry = paData.findEntry(PaDataType.PAC_REQUEST);
                assertThat(pacRequestEntry.getPaDataType()).isEqualTo(PaDataType.PAC_REQUEST);
                assertThat(pacRequestEntry.getPaDataValue()).isEqualTo(Arrays.copyOfRange(bytes, 108, 115));
        ...
        

        Please let me know if it doesn't work and I can definitely help.

        daryn Daryn Sharp added a comment -

        One question: maybe we could avoid maintaining the DER parser code by leveraging the ASN.1 and Kerberos library in Apache Kerby?

        I had quickly looked at Kerby and Apache DS. I didn't use them for a few reasons: the extra dependency burden; they're relatively heavyweight for partially decoding the ticket when we need just a tiny portion of it; and the encoding format is decades old, so there shouldn't be a maintainability issue.

        Thoughts?

        drankye Kai Zheng added a comment -

        I had quickly looked at Kerby and Apache DS. I didn't use them for a few reasons: the extra dependency burden.

        In case it's needed: Apache DS involves LDAP, so its dependency is heavy, but this work can be done using Kerby, and only the org.apache.kerby kerb-core module is needed, which is pretty lightweight.

        The encoding format is decades old so there shouldn't be a maintainability issue.

        Anyway, to decode the Kerberos ticket we need to test and verify that it works with the main Kerberos KDC products. It's good that some tests are already provided; I guess the packets are AD- or MIT-issued ones?

        daryn Daryn Sharp added a comment -

        Based on your suggestion, I looked at the Kerby code again. It's much more expensive in both computation and object allocation rates, the latter of which we definitely don't need. My goal is an extremely lightweight and minimal decode, since the GSSManager is subsequently going to do a full decode.

        I did testing with AD, and the unit tests use mini-kdc-issued tickets. I wouldn't be too worried about KDCs, though. Service tickets are an ancient and well-defined RFC format. The JDK very rigidly follows it and makes assumptions about DER tag ordering (it will incidentally blow up if its assumption is wrong), whereas I'm being more correct by looking up and verifying the DER tags.
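
        (To make "looking up and verifying DER tags" concrete, a minimal sketch of reading one tag/length header from a DER stream and checking it is the expected tag — illustrative only, not the patch's parser:)

        import java.nio.ByteBuffer;

        public class DerTagSketch {
          // Reads one DER tag/length pair and verifies the tag, rather than
          // assuming a fixed field order. Returns the content length.
          static int expectTag(ByteBuffer buf, int expectedTag) {
            int tag = buf.get() & 0xff;
            if (tag != expectedTag) {
              throw new IllegalArgumentException("Expected tag 0x"
                  + Integer.toHexString(expectedTag)
                  + " but found 0x" + Integer.toHexString(tag));
            }
            int length = buf.get() & 0xff;
            if (length == 0x80 || length > 0x84) {
              throw new IllegalArgumentException("Unsupported DER length encoding");
            }
            if (length > 0x80) {                 // long form: low bits give byte count
              int numBytes = length & 0x7f;
              length = 0;
              for (int i = 0; i < numBytes; i++) {
                length = (length << 8) | (buf.get() & 0xff);
              }
            }
            return length;                       // number of content bytes that follow
          }
        }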

        daryn Daryn Sharp added a comment -

        Kai Zheng, this is an internal blocker, and I would really like not to have to maintain it internally. Any objections to the actual implementation?

        shahrs87 Rushabh S Shah added a comment -

        KeyTab.getUnboundInstance is not supported in Java 7.
        It's a newly added API in Java 8.
        I noticed that the patch also applies without any conflicts to branch-2.8, but it would be nice if you could create a branch-2/branch-2.8 patch so that Jenkins can build with Java 7 and Java 8.

        drankye Kai Zheng added a comment -

        Daryn, sorry for the late response. I'm OK with proceeding with the current approach, though I'd like to take some time later to explore it more and improve it if that sounds good.

        daryn Daryn Sharp added a comment -

        Kai Zheng, thanks for the reviews! I don't mean to be pedantic, but could you please provide an explicit +1? BTW, we've had great success with this patch internally.

        drankye Kai Zheng added a comment -

        I'm travelling and looking for time to do a full, detailed review; it should happen in a few days. Meanwhile, could you address Rushabh S Shah's comment above? Thanks!

        daryn Daryn Sharp added a comment -

        Removed use of the JDK 8-specific keytab method.

        hadoopqa Hadoop QA added a comment -
        -1 overall



        Vote Subsystem Runtime Comment
        0 reexec 0m 16s Docker mode activated.
        +1 @author 0m 0s The patch does not contain any @author tags.
        +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
        +1 mvninstall 12m 40s trunk passed
        +1 compile 13m 9s trunk passed
        +1 checkstyle 0m 21s trunk passed
        +1 mvnsite 0m 28s trunk passed
        -1 findbugs 0m 31s hadoop-common-project/hadoop-auth in trunk has 1 extant Findbugs warnings.
        +1 javadoc 0m 21s trunk passed
        +1 mvninstall 0m 16s the patch passed
        +1 compile 9m 39s the patch passed
        +1 javac 9m 39s the patch passed
        +1 checkstyle 0m 21s hadoop-common-project/hadoop-auth: The patch generated 0 new + 40 unchanged - 1 fixed = 40 total (was 41)
        +1 mvnsite 0m 28s the patch passed
        +1 whitespace 0m 0s The patch has no whitespace issues.
        +1 findbugs 0m 37s the patch passed
        +1 javadoc 0m 22s the patch passed
        +1 unit 2m 50s hadoop-auth in the patch passed.
        +1 asflicense 0m 34s The patch does not generate ASF License warnings.
        44m 54s



        Subsystem Report/Notes
        Docker Image:yetus/hadoop:14b5c93
        JIRA Issue HADOOP-14146
        JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12871248/HADOOP-14146.2.patch
        Optional Tests asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle
        uname Linux 6cd3396d1de5 4.4.0-43-generic #63-Ubuntu SMP Wed Oct 12 13:48:03 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
        Build tool maven
        Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
        git revision trunk / 46f7e91
        Default Java 1.8.0_131
        findbugs v3.1.0-RC1
        findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/12442/artifact/patchprocess/branch-findbugs-hadoop-common-project_hadoop-auth-warnings.html
        Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/12442/testReport/
        modules C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth
        Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/12442/console
        Powered by Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org

        This message was automatically generated.

        daryn Daryn Sharp added a comment -

        Kai Zheng, have you had a chance to review?

        drankye Kai Zheng added a comment -

        In addition to the above comments, some more:

        1. Ref. the code below: NT_GSS_KRB5_PRINCIPAL could be NT_GSS_KRB5_PRINCIPAL_OID.

        +  public static final Oid GSS_SPNEGO_MECH_OID =
        +      getNumericOidInstance("1.3.6.1.5.5.2");
        +  public static final Oid GSS_KRB5_MECH_OID =
        +      getNumericOidInstance("1.2.840.113554.1.2.2");
        +  public static final Oid NT_GSS_KRB5_PRINCIPAL =
        +      getNumericOidInstance("1.2.840.113554.1.2.2.1");
        

        2. Ref. the code below: the message could be more specific, like "Invalid server principal {} decoded from client request".

        +        final String serverPrincipal =
        +            KerberosUtil.getTokenServerName(clientToken);
        +        if (!serverPrincipal.startsWith("HTTP/")) {
        +          throw new IllegalArgumentException(
        +              "Invalid server principal: " + serverPrincipal);
        +        }
        

        3. You get rid of the login check for each HTTP server principal listed in the keytab; instead, you put them into the server Subject directly. Is it possible that a server principal is expired or invalid at that time?

        -      for (String spnegoPrincipal : spnegoPrincipals) {
        -        LOG.info("Login using keytab {}, for principal {}",
        -            keytab, spnegoPrincipal);
        -        final KerberosConfiguration kerberosConfiguration =
        -            new KerberosConfiguration(keytab, spnegoPrincipal);
        -        final LoginContext loginContext =
        -            new LoginContext("", serverSubject, null, kerberosConfiguration);
        -        try {
        -          loginContext.login();
        -        } catch (LoginException le) {
        -          LOG.warn("Failed to login as [{}]", spnegoPrincipal, le);
        -          throw new AuthenticationException(le);          
        -        }
        -        loginContexts.add(loginContext);
        

        4. Besides, you might want to call KerberosUtil.hasKerberosKeyTab with the keytab instance placed in the Subject. I also wonder how the instance will be used in the subsequent SPNEGO authentication of the client token. Could you explain a bit for me, or add a comment in the code? Thanks!

        +      KeyTab keytabInstance = KeyTab.getInstance(keytabFile);
        +      serverSubject.getPrivateCredentials().add(keytabInstance);
        

        5. Is this a good chance to move the following block to somewhere like KerberosUtil?

            /* Return the OS login module class name */
            private static String getOSLoginModuleName() {
              if (IBM_JAVA) {
                if (windows) {
                  return is64Bit ? "com.ibm.security.auth.module.Win64LoginModule"
                      : "com.ibm.security.auth.module.NTLoginModule";
                } else if (aix) {
                  return is64Bit ? "com.ibm.security.auth.module.AIX64LoginModule"
                      : "com.ibm.security.auth.module.AIXLoginModule";
                } else {
                  return "com.ibm.security.auth.module.LinuxLoginModule";
                }
              } else {
                return windows ? "com.sun.security.auth.module.NTLoginModule"
                    : "com.sun.security.auth.module.UnixLoginModule";
              }
            }
        
        daryn Daryn Sharp added a comment -

        Kai Zheng, thanks for your time!

        1. Added OID suffix.
        2. Changed exception message.
        3. The login module's main purpose is obtaining a TGT, which initiators/clients require. Acceptor services only need the secrets in the keytab. The KeyTab instance automatically loads the secrets.
        4. I'm not sure I understand the first part.
          • There's no need to pre-check for a keytab in the subject because the authenticator created an empty subject.
          • Java's GSS implementations used in SASL and SPNEGO automatically look for a keytab instance.
        5. While perhaps a good idea, it's unrelated to this patch and more suitable for another jira if someone wants to do it.
        hadoopqa Hadoop QA added a comment -
        -1 overall



        Vote Subsystem Runtime Comment
        0 reexec 0m 17s Docker mode activated.
        +1 @author 0m 0s The patch does not contain any @author tags.
        +1 test4tests 0m 0s The patch appears to include 2 new or modified test files.
        +1 mvninstall 14m 42s trunk passed
        +1 compile 16m 29s trunk passed
        +1 checkstyle 0m 18s trunk passed
        +1 mvnsite 0m 25s trunk passed
        -1 findbugs 0m 33s hadoop-common-project/hadoop-auth in trunk has 1 extant Findbugs warnings.
        +1 javadoc 0m 21s trunk passed
        +1 mvninstall 0m 20s the patch passed
        +1 compile 12m 6s the patch passed
        +1 javac 12m 6s the patch passed
        +1 checkstyle 0m 20s hadoop-common-project/hadoop-auth: The patch generated 0 new + 40 unchanged - 1 fixed = 40 total (was 41)
        +1 mvnsite 0m 29s the patch passed
        +1 whitespace 0m 0s The patch has no whitespace issues.
        +1 findbugs 0m 45s the patch passed
        +1 javadoc 0m 18s the patch passed
        +1 unit 2m 48s hadoop-auth in the patch passed.
        +1 asflicense 0m 37s The patch does not generate ASF License warnings.
        52m 39s



        Subsystem Report/Notes
        Docker Image:yetus/hadoop:14b5c93
        JIRA Issue HADOOP-14146
        JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12872096/HADOOP-14146.3.patch
        Optional Tests asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle
        uname Linux 39a19f4ddf52 3.13.0-116-generic #163-Ubuntu SMP Fri Mar 31 14:13:22 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
        Build tool maven
        Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
        git revision trunk / 5672ae7
        Default Java 1.8.0_131
        findbugs v3.1.0-RC1
        findbugs https://builds.apache.org/job/PreCommit-HADOOP-Build/12483/artifact/patchprocess/branch-findbugs-hadoop-common-project_hadoop-auth-warnings.html
        Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/12483/testReport/
        modules C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth
        Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/12483/console
        Powered by Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org

        This message was automatically generated.

        drankye Kai Zheng added a comment -

        Thanks Daryn Sharp for the nice update!

        I still have some concern about the large portion of ASN.1 & DER decoding code, because it will be a maintenance burden for the project. However, that utility is quite separate and we can refine that part later. With the good tests added, and based on my trust in you in this domain, I'm happy to provide my +1.

        You might wait a couple of days and then commit it if there are no other comments. Thanks!

        daryn Daryn Sharp added a comment -

        I don't think there's much worry of a completely incompatible change occurring. ASN.1 was defined in 1984; the Kerberos v5 DER encoding was defined in 1993.

        Thanks!

        daryn Daryn Sharp added a comment -

        Committed to trunk, branch 2 & 2.8.

        hudson Hudson added a comment -

        SUCCESS: Integrated in Jenkins build Hadoop-trunk-Commit #11902 (See https://builds.apache.org/job/Hadoop-trunk-Commit/11902/)
        HADOOP-14146. KerberosAuthenticationHandler should authenticate with (daryn: rev e806c6e0ce6026d53227b51d58ec6d5458164571)

        • (edit) hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/util/TestKerberosUtil.java
        • (edit) hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/KerberosUtil.java
        • (edit) hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/server/TestMultiSchemeAuthenticationHandler.java
        • (edit) hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/client/KerberosAuthenticator.java
        • (edit) hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/server/KerberosAuthenticationHandler.java
        ajisakaa Akira Ajisaka added a comment -

        This commit broke the build on JDK 7. Would you fix this in branch-2 and branch-2.8?

        [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hadoop-auth: Compilation failure
        [ERROR] /Users/ajisaka/git/hadoop/hadoop-common-project/hadoop-auth/src/main/java/org/apache/hadoop/security/authentication/util/KerberosUtil.java:[341,18] org.apache.hadoop.security.authentication.util.KerberosUtil.DER is not abstract and does not override abstract method remove() in java.util.Iterator
        
        daryn Daryn Sharp added a comment -

        Failing with Java 7 because Java 8's Iterator has a default implementation of remove(). Will post an addendum shortly.
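
        (A hedged sketch of what such an addendum looks like — an explicit remove() so the iterator compiles on Java 7, where Iterator has no default methods; the class here is made up for illustration:)

        import java.util.Iterator;

        public class ReadOnlyIteratorSketch implements Iterator<Integer> {
          private int next;

          @Override
          public boolean hasNext() {
            return next < 3;
          }

          @Override
          public Integer next() {
            return next++;
          }

          @Override
          public void remove() {
            // Java 7 requires this method to be implemented explicitly;
            // a read-only token walker simply refuses to remove anything.
            throw new UnsupportedOperationException();
          }
        }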

        daryn Daryn Sharp added a comment -

        Simply adds an implementation of remove(). The patch applies to trunk, branch-2, and branch-2.8.

        hadoopqa Hadoop QA added a comment -
        -1 overall



        Vote Subsystem Runtime Comment
        0 reexec 0m 19s Docker mode activated.
        +1 @author 0m 0s The patch does not contain any @author tags.
        -1 test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
        +1 mvninstall 13m 44s trunk passed
        +1 compile 14m 23s trunk passed
        +1 checkstyle 0m 19s trunk passed
        +1 mvnsite 0m 26s trunk passed
        +1 findbugs 0m 30s trunk passed
        +1 javadoc 0m 19s trunk passed
        +1 mvninstall 0m 16s the patch passed
        +1 compile 10m 37s the patch passed
        +1 javac 10m 37s the patch passed
        +1 checkstyle 0m 19s the patch passed
        +1 mvnsite 0m 26s the patch passed
        +1 whitespace 0m 0s The patch has no whitespace issues.
        +1 findbugs 0m 37s the patch passed
        +1 javadoc 0m 20s the patch passed
        +1 unit 2m 51s hadoop-auth in the patch passed.
        +1 asflicense 0m 33s The patch does not generate ASF License warnings.
        47m 49s



        Subsystem Report/Notes
        Docker Image:yetus/hadoop:14b5c93
        JIRA Issue HADOOP-14146
        JIRA Patch URL https://issues.apache.org/jira/secure/attachment/12874258/HADOOP-14146.addendum.patch
        Optional Tests asflicense compile javac javadoc mvninstall mvnsite unit findbugs checkstyle
        uname Linux e2b3edbc79e9 3.13.0-116-generic #163-Ubuntu SMP Fri Mar 31 14:13:22 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
        Build tool maven
        Personality /testptch/hadoop/patchprocess/precommit/personality/provided.sh
        git revision trunk / ca13b22
        Default Java 1.8.0_131
        findbugs v3.1.0-RC1
        Test Results https://builds.apache.org/job/PreCommit-HADOOP-Build/12616/testReport/
        modules C: hadoop-common-project/hadoop-auth U: hadoop-common-project/hadoop-auth
        Console output https://builds.apache.org/job/PreCommit-HADOOP-Build/12616/console
        Powered by Apache Yetus 0.5.0-SNAPSHOT http://yetus.apache.org

        This message was automatically generated.

        kihwal Kihwal Lee added a comment -

        +1 for the addendum. Will commit it shortly.

        kihwal Kihwal Lee added a comment -

        That wasn't enough. Missing Base64 symbol in tests.

        daryn Daryn Sharp added a comment -

        Java 7 doesn't have java.util.Base64, so I changed the import. This is only needed on branch-2.x.
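
        (A small sketch of the import swap being described — the class and method here are assumed for illustration; java.util.Base64 exists only on Java 8+, while the commons-codec class works on both:)

        // Java 8 only (what broke the branch-2 build):
        //   byte[] token = java.util.Base64.getDecoder().decode(encoded);
        // Works on Java 7 and 8 alike, and is already a hadoop-common dependency:
        import org.apache.commons.codec.binary.Base64;

        public class Base64ImportSketch {
          static byte[] decodeToken(String encoded) {
            return Base64.decodeBase64(encoded);
          }
        }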

        kihwal Kihwal Lee added a comment -

        It looks fine. org.apache.commons.codec.binary.Base64 is the one used everywhere else in common, so it's not a new dependency.

        kihwal Kihwal Lee added a comment -

        Committed to branch-2 and branch-2.8.


          People

          • Assignee:
            daryn Daryn Sharp
            Reporter:
            daryn Daryn Sharp
          • Votes:
            0
            Watchers:
            16

            Dates

            • Created:
              Updated:
              Resolved:
