Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
Steps to reproduce:
- create external table src (col int) partitioned by (year int);
- create external table dest (col int) partitioned by (year int);
- insert into src partition (year=2022) values (1);
- insert into dest partition (year=2022) values (2);
- hdfs dfs -rm -r ${hive.metastore.warehouse.external.dir}/dest/year=2022
- insert overwrite table dest select * from src;
We get a FileNotFoundException, as below:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Directory file:/home/yuwen/workdir/upstream/hive/itests/qtest/target/localfs/warehouse/ext_part/par=1 could not be cleaned up.
	at org.apache.hadoop.hive.ql.metadata.Hive.deleteOldPathForReplace(Hive.java:5387)
	at org.apache.hadoop.hive.ql.metadata.Hive.replaceFiles(Hive.java:5282)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadPartitionInternal(Hive.java:2657)
	at org.apache.hadoop.hive.ql.metadata.Hive.lambda$loadDynamicPartitions$6(Hive.java:3143)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
This happens because the cleanup path calls listStatus on a path that does not exist:

fs.listStatus(path, pathFilter)

An insert overwrite should not fail when there is nothing to clean up.
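The fix amounts to tolerating a missing directory during cleanup instead of propagating the exception. Below is a minimal, hypothetical sketch of that defensive pattern using the standard java.nio.file API as a stand-in for the Hadoop FileSystem API (the method name listOrEmpty and the use of Files.list are illustrative assumptions, not Hive's actual code):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class SafeList {
    // Hypothetical helper: return the directory's entries, or an empty list
    // if the directory no longer exists. A missing directory means there is
    // nothing to clean up, so it is not treated as an error.
    public static List<Path> listOrEmpty(Path dir) {
        try (Stream<Path> entries = Files.list(dir)) {
            return entries.collect(Collectors.toList());
        } catch (NoSuchFileException e) {
            // The directory was removed out of band (analogous to
            // `hdfs dfs -rm -r` on the partition dir in the repro steps);
            // treat it as already cleaned up.
            return Collections.emptyList();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        // Listing a directory that was never created yields an empty result
        // instead of an exception.
        System.out.println(listOrEmpty(Paths.get("definitely-missing-dir")).size());
    }
}
```

The same idea applies to the Hadoop API: either check existence before listing or catch the not-found case and skip the cleanup step.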