Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Component/s: None
Description
In the same application, TableA and TableB are both Hive Parquet tables, but TableA can't use the built-in Parquet reader and writer while TableB can.
In this situation, the session-wide spark.sql.hive.convertMetastoreParquet flag can't control the two tables separately, so I think we can add a fine-grained (per-table) configuration to handle this case.
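To illustrate the limitation, here is a minimal sketch of the current behavior: spark.sql.hive.convertMetastoreParquet is a single session-wide switch, so reading the two tables through different paths requires flipping the flag for the whole session between reads. The table names TableA and TableB come from the description above; everything else is standard Spark API.

```scala
import org.apache.spark.sql.SparkSession

object ConvertMetastoreParquetSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("convertMetastoreParquet-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Today: one session-wide switch applies to every Hive Parquet table.
    spark.conf.set("spark.sql.hive.convertMetastoreParquet", "true")
    spark.table("TableB").show() // read via the built-in Parquet reader

    // To keep TableA on the Hive SerDe path, the whole session must flip:
    spark.conf.set("spark.sql.hive.convertMetastoreParquet", "false")
    spark.table("TableA").show() // read via the Hive SerDe

    spark.stop()
  }
}
```

A fine-grained configuration, as proposed, would let each table choose its read path without this session-level toggling.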