Details
- Type: Sub-task
- Status: Resolved
- Priority: Minor
- Resolution: Fixed
- Fix Version: 4.0.0
Description
    val rootClassLoader: ClassLoader =
      if (SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_9)) {
        // In Java 9, the boot classloader can see few JDK classes. The intended parent
        // classloader for delegation is now the platform classloader.
        // See http://java9.wtf/class-loading/
        val platformCL =
          classOf[ClassLoader].getMethod("getPlatformClassLoader").
            invoke(null).asInstanceOf[ClassLoader]
        // Check to make sure that the root classloader does not know about Hive.
        assert(Try(platformCL.loadClass("org.apache.hadoop.hive.conf.HiveConf")).isFailure)
        platformCL
      } else {
        // The boot classloader is represented by null (the instance itself isn't accessible)
        // and before Java 9 can see all JDK classes
        null
      }
Spark 4.0.0 requires Java 17 or later, so the Java 9 version check (and the `else` branch returning `null` for the boot classloader) is no longer necessary.
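A minimal sketch of what the logic could reduce to once Java 17 is the floor: the Java 9 branch is always taken, so the reflective `getMethod`/`invoke` lookup can become a direct call to `ClassLoader.getPlatformClassLoader` (available since Java 9). The object name below is hypothetical, not the actual Spark class:

```scala
import scala.util.Try

object PlatformLoaderCheck {
  // With Java 17 as the minimum, the platform classloader is always available,
  // so no reflection and no pre-Java-9 fallback to the boot classloader (null).
  val rootClassLoader: ClassLoader = ClassLoader.getPlatformClassLoader

  def main(args: Array[String]): Unit = {
    // Same sanity check as the original code: the platform classloader
    // must not be able to see Hive classes.
    assert(Try(rootClassLoader.loadClass("org.apache.hadoop.hive.conf.HiveConf")).isFailure)
    println("root classloader does not see Hive")
  }
}
```

Dropping the reflection also removes the only reason the original snippet went through `classOf[ClassLoader].getMethod(...)`: that indirection existed solely so the code would still compile on Java 8.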