Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.3.0
- Fix Version/s: None
Description
When performing an INSERT OVERWRITE on a data source table, Spark first deletes all of the table's partitions. For a non-partitioned table, it deletes the table folder itself, which is wrong because the table folder may carry metadata such as ACL entries.
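The distinction at the heart of this bug can be sketched outside of Spark. This is not Spark's actual code, just a minimal illustration of why deleting the folder itself (rather than only its contents) loses folder-level metadata; directory permission bits stand in for ACL entries here.

```python
import os
import shutil
import tempfile

def overwrite_deleting_folder(table_dir):
    # Buggy behavior: removing the whole folder and recreating it
    # discards folder-level metadata (ACLs, here mimicked by mode bits).
    shutil.rmtree(table_dir)
    os.makedirs(table_dir)  # recreated with default permissions

def overwrite_clearing_contents(table_dir):
    # Expected behavior: delete only the folder's contents, so the
    # directory itself (and its metadata) survives the overwrite.
    for entry in os.listdir(table_dir):
        path = os.path.join(table_dir, entry)
        if os.path.isdir(path):
            shutil.rmtree(path)
        else:
            os.remove(path)

table = tempfile.mkdtemp()
os.chmod(table, 0o750)  # stand-in for an ACL set on the table folder
open(os.path.join(table, "part-00000"), "w").close()

overwrite_clearing_contents(table)
assert os.listdir(table) == []                    # data is gone
assert (os.stat(table).st_mode & 0o777) == 0o750  # folder metadata kept
```

With the buggy variant, the recreated folder would come back with default permissions, so any ACLs an administrator had set on it are silently lost.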