Description
This issue aims to fix an ORC performance regression introduced between Spark 2.3.2 and the Spark 2.4.0 RCs: for column names containing `.`, the pushed-down predicates are ignored.
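As a standalone illustration of why dotted names are tricky (a hedged sketch, not Spark's actual code path; `DottedColumnSketch`, `naiveResolve`, and `quotedResolve` are hypothetical names): a name like `col.with.dot` is ambiguous between a single top-level column and a nested field path, so any component that naively splits on `.` fails to match the physical column and the predicate is silently dropped rather than pushed down.

```scala
// Standalone sketch (not Spark's implementation): a naive split on "."
// turns the single top-level column "col.with.dot" into a nested path,
// so a lookup against the physical schema finds no such column and the
// filter cannot be pushed down.
object DottedColumnSketch {
  // The physical ORC schema has exactly one top-level column.
  val physicalColumns = Set("col.with.dot")

  // Naive resolution: treats "." as a nested-field separator.
  def naiveResolve(name: String): Option[String] = {
    val head = name.split("\\.").head // "col"
    if (physicalColumns.contains(head)) Some(head) else None
  }

  // Safe resolution: treats the quoted name as one opaque identifier.
  def quotedResolve(name: String): Option[String] =
    if (physicalColumns.contains(name)) Some(name) else None

  def main(args: Array[String]): Unit = {
    println(naiveResolve("col.with.dot"))  // None -> predicate dropped
    println(quotedResolve("col.with.dot")) // Some(col.with.dot) -> pushable
  }
}
```

This is also why the queries below quote the column with backticks (`` `col.with.dot` ``): the name must reach the reader as a single identifier.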
Test Data
scala> val df = spark.range(Int.MaxValue).sample(0.2).toDF("col.with.dot")
scala> df.write.mode("overwrite").orc("/tmp/orc")
Spark 2.3.2
scala> spark.sql("set spark.sql.orc.impl=native")
scala> spark.sql("set spark.sql.orc.filterPushdown=true")
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` < 10").show)
+------------+
|col.with.dot|
+------------+
|           1|
|           8|
+------------+

Time taken: 1486 ms

scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` < 10").show)
+------------+
|col.with.dot|
+------------+
|           1|
|           8|
+------------+

Time taken: 163 ms
Spark 2.4.0 RC2
scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` < 10").show)
+------------+
|col.with.dot|
+------------+
|           1|
|           8|
+------------+

Time taken: 4087 ms

scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` < 10").show)
+------------+
|col.with.dot|
+------------+
|           1|
|           8|
+------------+

Time taken: 1998 ms