Description
The MiniKdc shipped with Hadoop versions below 3.0 does not work correctly on JDK 11.
Java 11 enabled two new Kerberos 5 encryption types by default, aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192. The pre-3.0.0 MiniKdc used by Spark does not support these encryption types, so authentication fails when they are enabled.
Hadoop JIRA: https://issues.apache.org/jira/browse/HADOOP-12911
In that JIRA, the author replaced the original Apache Directory project, which is no longer maintained (though the JIRA does not say it fails on JDK 11), with Apache Kerby, a Java Kerberos binding that works across Java versions.
Flink hit the same problem: apache/flink#9622
The author there explains why hadoop-2.7.2's MiniKdc fails on JDK 11: Java 11 enables the new encryption types aes128-cts-hmac-sha256-128 and aes256-cts-hmac-sha384-192 (for Kerberos 5) by default, and hadoop-2.7's MiniKdc does not support them, which results in the authentication failure.
When I tested hadoop-2.7.2's MiniKdc locally, the Kerberos debug output showed the error "read message stream failed, message can't match".
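A common workaround, taken for example in KAFKA-7338 and HBASE-27564, is to restrict the test JVM to an encryption type that the old MiniKdc supports. A minimal krb5.conf sketch follows; the realm name EXAMPLE.COM, the KDC address, and the choice of aes128-cts-hmac-sha1-96 are illustrative assumptions, not values from this issue:

```
# Sketch: force a pre-Java-11 encryption type so the Apache Directory-based
# MiniKdc can decode the Kerberos messages. Realm/KDC values are placeholders.
[libdefaults]
    default_realm = EXAMPLE.COM
    default_tkt_enctypes = aes128-cts-hmac-sha1-96
    default_tgs_enctypes = aes128-cts-hmac-sha1-96
    permitted_enctypes = aes128-cts-hmac-sha1-96

[realms]
    EXAMPLE.COM = {
        kdc = localhost:88
    }
```

Point the JVM at this file with -Djava.security.krb5.conf=/path/to/krb5.conf. Alternatively, moving to a MiniKdc based on Apache Kerby (HADOOP-12911) removes the need for this restriction.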
Attachments
Issue Links
- is related to HBASE-27564 Add default encryption type for MiniKDC to fix failed tests on JDK11+ (Resolved)
- relates to KAFKA-7338 Fix SASL Kerberos tests with Java 11 (Resolved)
- relates to FLINK-13516 YARNSessionFIFOSecuredITCase fails on Java 11 (Closed)
- relates to HADOOP-12911 Upgrade Hadoop MiniKDC with Kerby (Resolved)
- links to