Description
The `spark.sql.sources.partitionOverwriteMode` configuration allows overwriting a table's existing data in static mode, but for a Hive table this can be disastrous: performing a dynamic-partition overwrite while `partitionOverwriteMode=STATIC` may delete all data in the partitioned Hive table, not just the partitions being written.
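A minimal sketch of the risky scenario, assuming a hypothetical Hive partitioned table named `sales` with partition column `dt` (table, column, and app names are illustrative, not from this change):

```scala
import org.apache.spark.sql.SparkSession

object DynamicOverwriteUnderStaticMode {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dynamic-overwrite-under-static-mode")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._

    // STATIC is the default value; set explicitly here for clarity.
    spark.conf.set("spark.sql.sources.partitionOverwriteMode", "STATIC")

    // Depending on the Hive configuration, dynamic partition inserts may
    // also require these settings.
    spark.sql("SET hive.exec.dynamic.partition = true")
    spark.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

    // Hypothetical Hive partitioned table, assumed to already hold data
    // in several dt partitions.
    spark.sql(
      """CREATE TABLE IF NOT EXISTS sales (amount DOUBLE)
        |PARTITIONED BY (dt STRING)
        |STORED AS PARQUET""".stripMargin)

    // New data targeting only the dt=2024-01-02 partition.
    val newData = Seq((42.0, "2024-01-02")).toDF("amount", "dt")

    // Dynamic-partition overwrite while partitionOverwriteMode is STATIC:
    // instead of replacing only the dt=2024-01-02 partition, this can drop
    // every existing partition of the Hive table.
    newData.write.mode("overwrite").insertInto("sales")
  }
}
```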
Here we add a check for this case and throw an exception when it is detected.