Description
Spark SQL returns NULL for a column whose Hive metastore schema and Parquet schema differ in letter case, regardless of whether spark.sql.caseSensitive is set to true or false.
Here is a simple example to reproduce this issue:
scala> spark.range(5).toDF.write.mode("overwrite").saveAsTable("t1")
spark-sql> show create table t1;
CREATE TABLE `t1` (`id` BIGINT)
USING parquet
OPTIONS (
`serialization.format` '1'
)
spark-sql> CREATE TABLE `t2` (`ID` BIGINT)
> USING parquet
> LOCATION 'hdfs://localhost/user/hive/warehouse/t1';
spark-sql> select * from t1;
0
1
2
3
4
spark-sql> select * from t2;
NULL
NULL
NULL
NULL
NULL
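The NULLs arise because field resolution against the physical Parquet schema is case-sensitive, so the catalog column `ID` never matches the file's `id` and the reader treats the column as missing. As a minimal sketch (not Spark's actual implementation), a case-insensitive resolution step could map catalog field names to physical file field names like this; the helper name `resolveFields` and its error handling are illustrative assumptions:

import java.util.Locale
import org.apache.spark.sql.types.StructType

def resolveFields(catalogSchema: StructType, parquetSchema: StructType): Map[String, String] = {
  // Index the physical Parquet fields by lower-cased name.
  val byLowerName = parquetSchema.fields.groupBy(_.name.toLowerCase(Locale.ROOT))

  catalogSchema.fields.flatMap { catalogField =>
    byLowerName.get(catalogField.name.toLowerCase(Locale.ROOT)) match {
      case Some(Array(physical)) =>
        // Unique case-insensitive match: read the physical column under the catalog name.
        Some(catalogField.name -> physical.name)
      case Some(ambiguous) =>
        // Several physical fields differ only by case; failing is safer than silently
        // picking one (compare SPARK-25175 for the ORC native reader).
        throw new RuntimeException(
          s"Ambiguous field '${catalogField.name}' matches: ${ambiguous.map(_.name).mkString(", ")}")
      case None =>
        None // Column genuinely missing from the file: Spark fills it with NULL.
    }
  }.toMap
}

For the tables above, resolveFields would map t2's `ID` to the file's `id`, so the stored values 0 through 4 would be returned instead of NULL.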
Issue Links
- is related to SPARK-25206 wrong records are returned when Hive metastore schema and parquet schema are in different letter cases (Resolved)
- is related to SPARK-25175 Field resolution should fail if there's ambiguity for ORC native reader (Resolved)