Description
SPARK-10978 added the unhandledFilters interface, which gives a data source a chance to tell Spark SQL that it may not apply some of the pushed filters to every row; Spark SQL should then use a Filter operator to re-evaluate those filters. However, even when a filter appears in the returned unhandledFilters, we should still push it down to the source. For example, our internal data sources do not override this method; if we stopped pushing down filters reported as unhandled, we would effectively turn off the filter pushdown feature for them.
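The intended planning behavior can be sketched as follows. This is an illustrative model, not Spark's actual internals: `plan_scan` is a hypothetical helper, and the filters are plain strings standing in for `Filter` objects. The key point is that every filter is pushed to the source, while the ones the source reports as unhandled are additionally re-evaluated by Spark.

```python
def plan_scan(filters, unhandled_filters):
    """Hypothetical sketch of the fixed planning logic.

    Returns (pushed, post_scan):
      pushed    -- filters handed to the data source (all of them, even
                   those the source may not apply to every row)
      post_scan -- filters Spark SQL must re-evaluate in a Filter operator,
                   i.e. those the source reported as unhandled
    """
    pushed = list(filters)
    unhandled = set(unhandled_filters)
    post_scan = [f for f in filters if f in unhandled]
    return pushed, post_scan


# A source that handles "a > 1" but may not fully apply "b = 2":
pushed, post_scan = plan_scan(["a > 1", "b = 2"], ["b = 2"])
# pushed    == ["a > 1", "b = 2"]  -- both still pushed down
# post_scan == ["b = 2"]           -- re-checked by a Filter operator
```

A source that does not override unhandledFilters reports all filters as unhandled; under the buggy behavior nothing would be pushed down for it, whereas with the fix everything is still pushed and simply re-evaluated.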
Issue Links
- is duplicated by
  - SPARK-11621 ORC filter pushdown not working properly after new unhandled filter interface. (Resolved)
- is related to
  - SPARK-11621 ORC filter pushdown not working properly after new unhandled filter interface. (Resolved)
  - SPARK-10978 Allow PrunedFilterScan to eliminate predicates from further evaluation (Resolved)