Spark / SPARK-27238

Within the same application, some Hive Parquet (ORC) tables may not be able to use the built-in Parquet (ORC) reader and writer


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Within the same application, TableA and TableB are both Hive Parquet tables, but TableA cannot use the built-in Parquet reader and writer.

      In this situation, spark.sql.hive.convertMetastoreParquet cannot handle the tables separately, because it is a single global switch. I think we can add a fine-grained configuration to handle this case.
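      To illustrate the limitation: the conversion between the Hive SerDe path and the built-in Parquet reader is controlled by one session-wide setting, so it cannot differ per table. The sketch below uses the real `spark.sql.hive.convertMetastoreParquet` config; the per-table override in the trailing comment is hypothetical, not an existing Spark option:

```sql
-- Global switch: applies to every Hive Parquet table in the session.
SET spark.sql.hive.convertMetastoreParquet=true;

-- Both tables now go through the built-in Parquet reader/writer,
-- even if only TableB is actually compatible with it.
SELECT * FROM TableA;
SELECT * FROM TableB;

-- A hypothetical fine-grained knob (does not exist in Spark) might let
-- TableA alone fall back to the Hive SerDe path, e.g. via a table property:
-- ALTER TABLE TableA SET TBLPROPERTIES ('convertMetastore'='false');
```

      The corresponding ORC setting, spark.sql.hive.convertMetastoreOrc, has the same all-or-nothing behavior.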


            People

              Assignee: Unassigned
              Reporter: liuxian (10110346)
              Votes: 0
              Watchers: 1

              Dates

                Created:
                Updated:
                Resolved: