SPARK-22660

Use position() and limit() to fix ambiguity issue in scala-2.12

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.3.0
    • Component/s: Build
    • Labels: None

      Description

      Build with Scala 2.12 using the following steps:
      1. Change the pom.xml to Scala 2.12:
      ./dev/change-scala-version.sh 2.12
      2. Build with -Pscala-2.12.
      For Hive on Spark:

      ./dev/make-distribution.sh   --tgz -Pscala-2.12 -Phadoop-2.7  -Pyarn -Pparquet-provided -Dhadoop.version=2.7.3
      

      For Spark SQL:

      ./dev/make-distribution.sh  --tgz -Pscala-2.12 -Phadoop-2.7  -Pyarn -Phive -Dhadoop.version=2.7.3>log.sparksql 2>&1
      

      The build fails with the following errors.
      #Error1

      /common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java:172: error: cannot find symbol
          Cleaner cleaner = Cleaner.create(buffer, () -> freeMemory(memory));
      

      This is because sun.misc.Cleaner has been moved to a new location in JDK 9. HADOOP-12760 will be the long-term fix.
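      A possible short-term workaround (not the HADOOP-12760 fix; the helper name below is illustrative) is to resolve the Cleaner factory reflectively so the same source builds on both JDK 8 and JDK 9, where the class moved to jdk.internal.ref.Cleaner:

      import java.lang.reflect.Method

      // Try the JDK 8 location first, then the JDK 9 one. Both classes expose a
      // static create(Object, Runnable) factory; invoking the JDK 9 one may still
      // need --add-exports java.base/jdk.internal.ref=ALL-UNNAMED.
      def findCleanerCreate(): Option[Method] = {
        Seq("sun.misc.Cleaner", "jdk.internal.ref.Cleaner").view.flatMap { name =>
          try {
            val cls = Class.forName(name)
            Some(cls.getMethod("create", classOf[Object], classOf[Runnable]))
          } catch {
            case _: ReflectiveOperationException => None
          }
        }.headOption
      }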

      #Error2

      spark_source/core/src/main/scala/org/apache/spark/executor/Executor.scala:455: ambiguous reference to overloaded definition,
      both method limit in class ByteBuffer of type (x$1: Int)java.nio.ByteBuffer
      and  method limit in class Buffer of type ()Int
      match expected type ?
          val resultSize = serializedDirectResult.limit
                                                  ^
      

      The limit method was moved from ByteBuffer to the superclass Buffer, and it can no longer be called without (). The same applies to the position method.
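      As the issue title says, the fix is to call these accessors with an explicit empty argument list. A minimal sketch (the ByteBuffer setup below is illustrative, not the actual Executor code):

      import java.nio.ByteBuffer

      // Illustrative stand-in for the serialized task result buffer.
      val serializedDirectResult: ByteBuffer = ByteBuffer.allocate(128)

      // Ambiguous under Scala 2.12 on JDK 9: both limit(Int) and limit() match.
      // val resultSize = serializedDirectResult.limit

      // Unambiguous: spell out the empty parameter list.
      val resultSize = serializedDirectResult.limit()
      val resultPos = serializedDirectResult.position()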

      #Error3

      [error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:415: ambiguous reference to overloaded definition,
      [error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
      [error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
      [error] match argument types (java.util.Map[String,String])
      [error]     properties.putAll(propsMap.asJava)
      [error]                ^
      [error] /home/zly/prj/oss/jdk9_HOS_SOURCE/spark_source/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformationExec.scala:427: ambiguous reference to overloaded definition,
      [error] both method putAll in class Properties of type (x$1: java.util.Map[_, _])Unit
      [error] and  method putAll in class Hashtable of type (x$1: java.util.Map[_ <: Object, _ <: Object])Unit
      [error] match argument types (java.util.Map[String,String])
      [error]       props.putAll(outputSerdeProps.toMap.asJava)
      [error]             ^
       

      This is because Properties extends Hashtable[Object, Object], so the key type is Object instead of String, which is unsafe.
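      One way to sidestep the putAll ambiguity is to insert the entries one by one. A minimal sketch (propsMap stands in for the serde properties; its contents are illustrative, not the actual ScriptTransformationExec code):

      import java.util.Properties
      import scala.collection.JavaConverters._

      val propsMap: Map[String, String] = Map("field.delim" -> "\t")

      val properties = new Properties()
      // Ambiguous under Scala 2.12 on JDK 9: Properties.putAll and Hashtable.putAll
      // both match a java.util.Map[String, String].
      // properties.putAll(propsMap.asJava)

      // Unambiguous: add each key/value pair directly via put.
      propsMap.foreach { case (k, v) => properties.put(k, v) }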

      After fixing these three errors, the build compiles successfully.

    People

    • Assignee: kellyzly liyunzhang
    • Reporter: kellyzly liyunzhang
    • Votes: 0
    • Watchers: 3
