Details
Type: Sub-task
Status: In Progress
Priority: Major
Resolution: Unresolved
Affects Version/s: 3.1.0
Fix Version/s: None
Component/s: None
Description
The partition value '__HIVE_DEFAULT_PARTITION__' should be handled as null, the same way DSv1 handles it.
For example, in DSv1:

spark-sql> CREATE TABLE tbl11 (id int, part0 string) USING parquet PARTITIONED BY (part0);
spark-sql> ALTER TABLE tbl11 ADD PARTITION (part0 = '__HIVE_DEFAULT_PARTITION__');
spark-sql> INSERT INTO tbl11 PARTITION (part0='__HIVE_DEFAULT_PARTITION__') SELECT 1;
spark-sql> SELECT * FROM tbl11;
1	NULL
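Below is a minimal Scala sketch of the kind of mapping DSv2 partition handling would need, i.e. treating the sentinel string as a SQL NULL before building the partition identifier row. The object and method names (HivePartitionDefaults, toStoredValue, toPartitionIdent) are hypothetical and only illustrate the expected behaviour; they are not Spark's actual API.

import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.unsafe.types.UTF8String

// Hypothetical helper object; not part of Spark's API.
object HivePartitionDefaults {
  // Sentinel string used by Hive/DSv1 to represent a null partition value.
  val DefaultPartitionName = "__HIVE_DEFAULT_PARTITION__"

  // Convert a raw string partition value into the stored value,
  // treating the sentinel as SQL NULL (mirroring DSv1 behaviour).
  def toStoredValue(raw: String): UTF8String =
    if (raw == DefaultPartitionName) null else UTF8String.fromString(raw)

  // Build a single-column partition identifier row from a raw value.
  def toPartitionIdent(raw: String): InternalRow =
    InternalRow(toStoredValue(raw))
}

With a mapping like this applied when resolving the partition spec, the SELECT above would return "1 NULL" for the added partition, matching the DSv1 output.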