SPARK-5509: EqualTo operator doesn't handle binary type properly


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.1.0, 1.1.1, 1.2.0, 1.2.1, 1.3.0
    • Fix Version/s: 1.3.0
    • Component/s: SQL
    • Labels: None

    Description

      Binary type is mapped to Array[Byte], which can't be compared with == directly, since == on arrays checks reference equality rather than content. However, EqualTo.eval() uses plain == to compare values, so equality predicates on binary columns match no rows. Run the following spark-shell snippet with Spark 1.2.0 to reproduce the issue:

      import org.apache.spark.sql.SQLContext
      import sc._  // spark-shell's SparkContext; brings parallelize into scope
      
      val sqlContext = new SQLContext(sc)
      import sqlContext._  // sql(), registerFunction(), and the implicit RDD-to-SchemaRDD conversion
      
      case class KV(key: Int, value: Array[Byte])
      
      def toBinary(s: String): Array[Byte] = s.getBytes("UTF-8")
      registerFunction("toBinary", toBinary _)  // register toBinary as a SQL UDF
      
      parallelize(1 to 1024).map(i => KV(i, toBinary(i.toString))).registerTempTable("bin")
      
      // OK: range comparison on binary values returns the expected rows
      sql("select * from bin where value < toBinary('100')").collect()
      
      // Oops: the equivalent equality comparison returns nothing
      sql("select * from bin where value = toBinary('100')").collect()
      


          People

            Assignee: Cheng Lian
            Reporter: Cheng Lian
            Votes: 0
            Watchers: 3
