Description
The Binary type is mapped to Array[Byte], which can't be compared with == directly, since == on arrays is reference equality on the JVM. However, EqualTo.eval() uses plain == to compare values. Run the following spark-shell snippet with Spark 1.2.0 to reproduce this issue:
import org.apache.spark.sql.SQLContext
import sc._

val sqlContext = new SQLContext(sc)
import sqlContext._

case class KV(key: Int, value: Array[Byte])

def toBinary(s: String): Array[Byte] = s.getBytes("UTF-8")
registerFunction("toBinary", toBinary _)

parallelize(1 to 1024).map(i => KV(i, toBinary(i.toString))).registerTempTable("bin")

// OK
sql("select * from bin where value < toBinary('100')").collect()

// Oops, returns nothing
sql("select * from bin where value = toBinary('100')").collect()
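The root cause can be demonstrated outside Spark entirely. A minimal sketch on the JVM (plain Java here, same semantics as the Scala snippet above): two byte arrays with identical contents are not equal under ==, which is why EqualTo.eval() matches no rows, while an element-wise comparison such as java.util.Arrays.equals does match.

```java
import java.util.Arrays;

public class BinaryEquality {
    public static void main(String[] args) {
        // Two distinct arrays holding the same bytes, mimicking the
        // stored value and the toBinary('100') literal in the query.
        byte[] stored  = "100".getBytes();
        byte[] literal = "100".getBytes();

        // == on arrays is reference equality, so this prints false,
        // which is the comparison EqualTo.eval() effectively performs.
        System.out.println(stored == literal);

        // An element-wise comparison returns true, which is the
        // behavior the query expects.
        System.out.println(Arrays.equals(stored, literal));
    }
}
```

This suggests the fix is to special-case binary values in the equality evaluation so that they are compared element by element rather than by reference.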