Spark > SPARK-29194 JDK11 QA > SPARK-29957

Reset MiniKDC's default enctypes to fit jdk8/jdk11


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: Tests
    • Labels: None

    Description

      MiniKdc versions older than hadoop-3.0 do not work on JDK 11.
      Java 11 added two new Kerberos 5 encryption types, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192, and enables them by default. The pre-3.0.0 MiniKdc used by Spark does not support these encryption types, so authentication fails when they are enabled.


      Hadoop jira: https://issues.apache.org/jira/browse/HADOOP-12911
      In that jira, MiniKdc's backend was switched from the original Apache Directory Server (which is no longer maintained, although the jira does not say it fails on JDK 11) to Apache Kerby, a pure-Java Kerberos implementation that works across Java versions.

      Flink hit the same issue: apache/flink#9622
      That PR explains why hadoop-2.7.2's MiniKdc fails on JDK 11: Java 11 added the new Kerberos 5 encryption types aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 and enables them by default, and Spark built with hadoop-2.7's MiniKdc does not support them, which results in authentication failure when they are negotiated.

      When I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug output showed the error: read message stream failed, message can't match.
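
      The fix direction named in the title is to reset MiniKDC's default enctypes so that JDK 8 and JDK 11 clients both negotiate a type the old KDC understands. Below is a minimal sketch of that idea, not necessarily the change made in the Spark patch: it assumes a test rewrites the krb5.conf generated by the hadoop-2.7 MiniKdc to pin aes128-cts-hmac-sha1-96 and points java.security.krb5.conf at the patched file. The object name and the exact enctype choice are illustrative.

      import java.io.File
      import java.nio.charset.StandardCharsets
      import java.nio.file.Files
      import java.util.Properties

      import org.apache.hadoop.minikdc.MiniKdc

      object MiniKdcEnctypePinning {
        def main(args: Array[String]): Unit = {
          val workDir: File = Files.createTempDirectory("minikdc").toFile

          // Start the hadoop-2.7 MiniKdc with its default configuration.
          val conf: Properties = MiniKdc.createConf()
          val kdc = new MiniKdc(conf, workDir)
          kdc.start()

          // MiniKdc's generated krb5.conf does not set default_*_enctypes, so a JDK 11
          // client offers the new aes*-sha2 types that the old ApacheDS-based KDC cannot
          // handle. Pin an enctype that both JDK 8 and JDK 11 support in [libdefaults].
          val krb5 = kdc.getKrb5conf
          val original = new String(Files.readAllBytes(krb5.toPath), StandardCharsets.UTF_8)
          val pinned = original.replaceFirst(
            "(?m)^\\[libdefaults\\]$",
            """[libdefaults]
              |    default_tkt_enctypes = aes128-cts-hmac-sha1-96
              |    default_tgs_enctypes = aes128-cts-hmac-sha1-96
              |    permitted_enctypes = aes128-cts-hmac-sha1-96""".stripMargin)
          Files.write(krb5.toPath, pinned.getBytes(StandardCharsets.UTF_8))

          // The JDK caches krb5 settings, so this has to happen before the first
          // Kerberos login of the JVM (or the cached config must be refreshed).
          System.setProperty("java.security.krb5.conf", krb5.getAbsolutePath)

          // ... create principals/keytabs and run the kerberized test as usual ...
          val keytab = new File(workDir, "spark-test.keytab")
          kdc.createPrincipal(keytab, "client/localhost")

          kdc.stop()
        }
      }

      Pinning the enctypes on the client side avoids upgrading MiniKdc itself, and the same krb5.conf still works on JDK 8, since aes128-cts-hmac-sha1-96 is supported there as well.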

            People

              Assignee: angerszhuuu (angerszhu)
              Reporter: angerszhuuu (angerszhu)
              Votes: 0
              Watchers: 2
