Spark / SPARK-14070

Use ORC data source for SQL queries on ORC tables


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.6.1
    • Fix Version/s: 2.0.0
    • Component/s: SQL
    • Labels: None

    Description

      Currently, when querying ORC tables in Hive, the plan generated by Spark shows that it uses the `HiveTableScan` operator, which is generic across all file formats. We could instead use the ORC data source so that we get ORC-specific optimizations like predicate pushdown.

      Current behaviour:

      ```
      scala> hqlContext.sql("SELECT * FROM orc_table").explain(true)
      == Parsed Logical Plan ==
      'Project [unresolvedalias(*, None)]
      +- 'UnresolvedRelation `orc_table`, None

      == Analyzed Logical Plan ==
      key: string, value: string
      Project [key#171,value#172]
      +- MetastoreRelation default, orc_table, None

      == Optimized Logical Plan ==
      MetastoreRelation default, orc_table, None

      == Physical Plan ==
      HiveTableScan [key#171,value#172], MetastoreRelation default, orc_table, None
      ```
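
      Expected behaviour after the fix (a sketch, assuming a `spark.sql.hive.convertMetastoreOrc` flag mirroring the existing `spark.sql.hive.convertMetastoreParquet` one, together with the existing `spark.sql.orc.filterPushdown` setting; exact flag names may differ in the final patch):

      ```
      scala> hqlContext.setConf("spark.sql.hive.convertMetastoreOrc", "true")
      scala> hqlContext.setConf("spark.sql.orc.filterPushdown", "true")

      scala> hqlContext.sql("SELECT * FROM orc_table WHERE key = '5'").explain(true)
      ```

      With the conversion enabled, the physical plan should show a scan over the ORC data source relation (with the `key = '5'` filter pushed down into the ORC reader) instead of the generic `HiveTableScan` operator.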

People

  Assignee: tejasp Tejas Patil
  Reporter: tejasp Tejas Patil
  Michael Armbrust
  Votes: 0
  Watchers: 4
