Description
This is an umbrella JIRA for Apache Spark to support JDK 11.
Since JDK 8 is approaching EOL and JDK 9 and 10 are already end of life, per community discussion we will skip JDK 9 and 10 and support JDK 11 directly.
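For context, the snippet below is a minimal, illustrative sketch of the kind of JDK 11 incompatibility tracked by the linked issues (see SPARK-31531 under Issue Links): sun.misc.Cleaner is no longer reachable on JDK 9+, so code that eagerly frees direct ByteBuffers needs a version-aware path. The class name DirectBufferCleaner is hypothetical and this is not Spark's actual implementation; it only demonstrates the reflective JDK 8 / JDK 11 split.
{code:java}
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.nio.ByteBuffer;

// Hypothetical helper, for illustration only (not Spark's real code path).
public final class DirectBufferCleaner {

  // Eagerly release the native memory behind a direct ByteBuffer.
  // On JDK 9+ this goes through sun.misc.Unsafe.invokeCleaner(ByteBuffer);
  // on JDK 8 it falls back to DirectBuffer.cleaner().clean().
  public static void free(ByteBuffer buffer) throws Exception {
    if (!buffer.isDirect()) {
      return; // heap buffers are reclaimed by the GC as usual
    }
    Field theUnsafe = Class.forName("sun.misc.Unsafe").getDeclaredField("theUnsafe");
    theUnsafe.setAccessible(true);
    Object unsafe = theUnsafe.get(null);
    try {
      // JDK 9+ path: invokeCleaner was added to sun.misc.Unsafe in Java 9.
      Method invokeCleaner = unsafe.getClass().getMethod("invokeCleaner", ByteBuffer.class);
      invokeCleaner.invoke(unsafe, buffer);
    } catch (NoSuchMethodException e) {
      // JDK 8 path: cleaner() returns a sun.misc.Cleaner, which JDK 9+ removed
      // from the accessible API surface (the root cause behind SPARK-31531).
      Method cleaner = buffer.getClass().getMethod("cleaner");
      cleaner.setAccessible(true);
      Object cleanerInstance = cleaner.invoke(buffer);
      cleanerInstance.getClass().getMethod("clean").invoke(cleanerInstance);
    }
  }

  public static void main(String[] args) throws Exception {
    free(ByteBuffer.allocateDirect(1 << 20)); // 1 MiB off-heap buffer, freed immediately
  }
}
{code}
Either path avoids a hard compile-time dependency on JDK-internal classes, which is why a single build can run on both JDK 8 and JDK 11.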
Issue Links
- contains
  - SPARK-33999 Make sbt unidoc success with JDK11 (Resolved)
- is duplicated by
  - SPARK-24475 Nested JSON count() Exception (Resolved)
  - SPARK-27537 spark-2.4.1/mlib-local/src/test/scala/org/apache/spark/ml/linalg/MatricesSuite.scala.866:Value size is not a member of Object (Resolved)
  - SPARK-27538 sparksql could not start in jdk11, exception org.datanucleus.exceptions.NucleusException: The java type java.lang.Long (jdbc-type='', sql-type="") cant be mapped for this datastore. No mapping is available. (Resolved)
  - SPARK-31531 sun.misc.Cleaner sun.nio.ch.DirectBuffer.cleaner() method not found during spark-submit (Resolved)
  - SPARK-25820 Spark build fails with Java 9 (Resolved)
- is related to
  - SPARK-29173 Benchmark JDK 11 performance with FilterPushdownBenchmark (Resolved)
  - SPARK-26427 Upgrade Apache ORC to 1.5.4 (Resolved)
  - SPARK-29194 JDK11 QA (Resolved)
- is required by
  - SPARK-28596 Use Java 8 time API in date_trunc (Open)
- relates to
  - HADOOP-15338 Java 11 runtime support (Resolved)
  - KAFKA-7264 Initial Kafka support for Java 11 (Resolved)
  - PARQUET-1590 [parquet-format] Add Java 11 to Travis (Resolved)
  - HADOOP-10848 Cleanup calling of sun.security.krb5.Config (Resolved)
  - SPARK-28684 Hive module support JDK 11 (Resolved)
  - SPARK-25757 Upgrade netty-all from 4.1.17.Final to 4.1.30.Final (Resolved)