Description
When spark-shell is run with the configuration "spark.sql.orc.filterPushdown=true", the DataFrame functions where and filter return incorrect results when reading ORC data. In particular, the predicate "column is not null" on an array column matches zero rows when it should match one.
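For completeness, the repro below assumes the flag was enabled before the ORC read, either at shell startup or from inside the shell (a minimal sketch; both are standard ways to set a Spark SQL conf in 1.x):

// Option 1: enable pushdown when launching the shell
//   bin/spark-shell --conf spark.sql.orc.filterPushdown=true
// Option 2: enable it from within the shell
sqlContext.setConf("spark.sql.orc.filterPushdown", "true")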
Example Code:
import sqlContext.implicits._

case class MyData(string_field: String, array_field: Seq[String])

val myDataArray = Array(
  MyData("foo", Seq("bar")),
  MyData("foobar", null)
)

val myDataDF = sc.parallelize(myDataArray).toDF

// Counts on the in-memory DataFrame are correct
myDataDF.count                                    // 2
myDataDF.where("array_field is null").count       // 1
myDataDF.where("array_field is not null").count   // 1

// Round-trip the data through ORC
myDataDF.write.format("orc").save("/tmp/mydata.orc")
val myLoadedDataDF = sqlContext.read.format("orc").load("/tmp/mydata.orc")

// Counts on the loaded DataFrame
myLoadedDataDF.count                                    // 2, correct
myLoadedDataDF.where("array_field is not null").count  // 0, incorrect; should be 1
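For comparison, since the description ties the bug to the pushdown flag, re-running the query with pushdown disabled should presumably return the correct count (an inference from the above, not separately verified):

sqlContext.setConf("spark.sql.orc.filterPushdown", "false")
sqlContext.read.format("orc").load("/tmp/mydata.orc")
  .where("array_field is not null").count // expected: 1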