Description
Build with Scala 2.12 using the following steps:
1. Change the pom.xml to Scala 2.12:
./dev/change-scala-version.sh 2.12
2. Build with -Pscala-2.12.
For Hive on Spark:
./dev/make-distribution.sh --tgz -Pscala-2.12 -Phadoop-2.7 -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3
For Spark SQL:
./dev/make-distribution.sh --tgz -Pscala-2.12 -Phadoop-2.7 -Pyarn -Phive -Dhadoop.version=2.7.3 > log.sparksql 2>&1
The build fails with the following errors.
#Error1
/common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java:172: error: cannot find symbol
    Cleaner cleaner = Cleaner.create(buffer, () -> freeMemory(memory));
This is because sun.misc.Cleaner was moved to a new location (jdk.internal.ref.Cleaner) in JDK 9. HADOOP-12760 will be the long-term fix.
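One possible direction, sketched below under the assumption of a JDK 9+ only build, is the public replacement API java.lang.ref.Cleaner, which registers a cleanup action for an object instead of wrapping it the way sun.misc.Cleaner.create did. The freeMemory stand-in and the memory address are hypothetical placeholders for Spark's Platform internals (the real fix in Spark resolves the class reflectively to stay compatible with JDK 8).

```java
import java.lang.ref.Cleaner;
import java.nio.ByteBuffer;

public class CleanerSketch {
    // Hypothetical stand-in for Platform.freeMemory, which in Spark
    // releases the off-heap allocation backing a DirectByteBuffer.
    static void freeMemory(long address) {
        System.out.println("freed memory at " + address);
    }

    public static void main(String[] args) {
        long memory = 42L; // hypothetical off-heap address
        ByteBuffer buffer = ByteBuffer.allocateDirect(16);
        // JDK 9+ replacement for sun.misc.Cleaner: register an action that
        // runs when `buffer` becomes phantom reachable. The action must not
        // capture `buffer` itself, or it would never become unreachable.
        Cleaner cleaner = Cleaner.create();
        Cleaner.Cleanable cleanable = cleaner.register(buffer, () -> freeMemory(memory));
        cleanable.clean(); // run the cleanup eagerly, like an explicit free
    }
}
```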
#Error2
spark_source/core/src/main/scala/org/apache/spark/executor/Executor.scala:455: ambiguous reference to overloaded definition,
both method limit in class ByteBuffer of type (x$1: Int)java.nio.ByteBuffer
and method limit in class Buffer of type ()Int
match expected type ?
    val resultSize = serializedDirectResult.limit
The zero-argument accessor limit() is declared on the superclass Buffer, while JDK 9 adds an overriding limit(int) on ByteBuffer itself, so the paren-less Scala call is now ambiguous; it must be written explicitly as limit(). The same applies to the position method.
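A minimal Java sketch of the two methods involved (the variable name mirrors the Spark code; the buffer contents are illustrative only). In the Scala source the fix is simply to call serializedDirectResult.limit() with explicit parentheses, which selects the zero-argument accessor:

```java
import java.nio.ByteBuffer;

public class LimitSketch {
    public static void main(String[] args) {
        ByteBuffer serializedDirectResult = ByteBuffer.allocate(16);
        // Zero-arg accessor, declared on the superclass java.nio.Buffer.
        int resultSize = serializedDirectResult.limit();
        // One-arg setter; in JDK 9+ ByteBuffer overrides it covariantly
        // (returning ByteBuffer), which is what makes the paren-less
        // Scala call ambiguous between the two.
        serializedDirectResult.limit(8);
        System.out.println(resultSize + " " + serializedDirectResult.limit());
    }
}
```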
#Error3
/home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:415: ambiguous reference to overloaded definition,
[error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
[error] and method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
[error] match argument types (java.util.Map[String,String])
[error]     properties.putAll(propsMap.asJava)
[error]                ^
[error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:427: ambiguous reference to overloaded definition,
[error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
[error] and method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
[error] match argument types (java.util.Map[String,String])
[error]     props.putAll(outputSerdeProps.toMap.asJava)
[error]                  ^
This is ambiguous because JDK 9 declares putAll on Properties itself in addition to the inherited Hashtable.putAll, whose Object key and value types are unsafe for Properties, which expects String keys and values.
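One way to sidestep the ambiguity, shown here as a Java sketch with a hypothetical entry standing in for the SerDe properties, is to copy entries individually through the type-safe setProperty(String, String) instead of calling putAll at all:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class PutAllSketch {
    public static void main(String[] args) {
        // Hypothetical stand-in for the propsMap built in
        // ScriptTransformationExec from the SerDe configuration.
        Map<String, String> propsMap = new HashMap<>();
        propsMap.put("serialization.format", "1");

        Properties properties = new Properties();
        // Copy one entry at a time; setProperty takes (String, String),
        // so there is no overload for the compiler to get confused by.
        propsMap.forEach(properties::setProperty);

        System.out.println(properties.getProperty("serialization.format"));
    }
}
```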
After fixing these three errors, the build compiles successfully.
Issue Links
- is duplicated by
  - SPARK-22687 Run spark-sql in scala-2.12 and JDK9 (Resolved)
  - SPARK-22661 Fix the putAll compile error when compiling with scala-2.12 and jdk9 (Resolved)