WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://localhost:10000/lv_test
Connected to: Apache Hive (version 2.1.1-cdh6.3.2)
Driver: Hive JDBC (version 2.1.1-cdh6.3.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://localhost:10000/lv_test> set hive.execution.engine=spark;
No rows affected (0.044 seconds)
0: jdbc:hive2://localhost:10000/lv_test> set hive.auto.convert.join=true;
No rows affected (0.004 seconds)
0: jdbc:hive2://localhost:10000/lv_test> set hive.spark.dynamic.partition.pruning=true;
No rows affected (0.003 seconds)
0: jdbc:hive2://localhost:10000/lv_test> set hive.spark.dynamic.partition.pruning.map.join.only=true;
No rows affected (0.003 seconds)
0: jdbc:hive2://localhost:10000/lv_test> select *
. . . . . . . . . . . . . . . . . . . .> from (select *
. . . . . . . . . . . . . . . . . . . .>       from ods_img_gj_sed_prod_liming fp
. . . . . . . . . . . . . . . . . . . .>       where fp.part_date = 20220525) cfp
. . . . . . . . . . . . . . . . . . . .> inner join (select ic.client_id, ic.businsys_no
. . . . . . . . . . . . . . . . . . . .>             from ods_temp_cdt_increment_client ic
. . . . . . . . . . . . . . . . . . . .>             where ic.businsys_no = 5035) ici
. . . . . . . . . . . . . . . . . . . .> on cfp.businsys_no = ici.businsys_no and cfp.acct_id = ici.client_id;
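The session pins four properties: Hive on Spark as the execution engine, automatic map-join conversion, Spark dynamic partition pruning (DPP), and DPP restricted to map joins. Whether DPP actually fires for this join is visible in the plan. The following EXPLAIN is a diagnostic sketch that was not run in the original session; if the planner chose DPP, its output should contain a Spark Partition Pruning Sink Operator on the small-table side.

-- Diagnostic sketch, not part of the original session: inspect the plan
-- for a Spark Partition Pruning Sink Operator before executing the join.
explain
select *
from (select * from ods_img_gj_sed_prod_liming fp where fp.part_date = 20220525) cfp
inner join (select ic.client_id, ic.businsys_no
            from ods_temp_cdt_increment_client ic
            where ic.businsys_no = 5035) ici
  on cfp.businsys_no = ici.businsys_no and cfp.acct_id = ici.client_id;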
INFO : Compiling command(queryId=hive_20220601001521_02b9c045-ef56-4fbf-9006-114e56c49497): select * from (select * from ods_img_gj_sed_prod_liming fp where fp.part_date = 20220525) cfp inner join (select ic.client_id, ic.businsys_no from ods_temp_cdt_increment_client ic where ic.businsys_no=5035) ici on cfp.businsys_no = ici.businsys_no and cfp.acct_id = ici.client_id
INFO : Semantic Analysis Completed
INFO : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:cfp.init_date, type:decimal(10,0), comment:null), FieldSchema(name:cfp.businsys_no, type:decimal(10,0), comment:null), FieldSchema(name:cfp.acct_id, type:string, comment:null), FieldSchema(name:cfp.organ_prod_kind, type:string, comment:null), FieldSchema(name:cfp.prod_code, type:string, comment:null), FieldSchema(name:cfp.prod_full_name, type:string, comment:null), FieldSchema(name:cfp.prod_name, type:string, comment:null), FieldSchema(name:cfp.prod_enddate, type:decimal(10,0), comment:null), FieldSchema(name:cfp.prod_structure, type:string, comment:null), FieldSchema(name:cfp.prodearnings_character, type:string, comment:null), FieldSchema(name:cfp.prod_scale, type:decimal(15,4), comment:null), FieldSchema(name:cfp.prodpossessor_type, type:string, comment:null), FieldSchema(name:cfp.initleverage_ratio, type:string, comment:null), FieldSchema(name:cfp.finan_futures_account, type:string, comment:null), FieldSchema(name:cfp.prodmana_code, type:string, comment:null), FieldSchema(name:cfp.product_open_period, type:string, comment:null), FieldSchema(name:cfp.product_net_value, type:decimal(15,8), comment:null), FieldSchema(name:cfp.product_nav_date, type:decimal(10,0), comment:null), FieldSchema(name:cfp.remark, type:string, comment:null), FieldSchema(name:cfp.custodian_tel, type:string, comment:null), FieldSchema(name:cfp.prod_register_date, type:decimal(10,0), comment:null), FieldSchema(name:cfp.agency_name, type:string, comment:null), FieldSchema(name:cfp.prod_cust_type, type:string, comment:null), FieldSchema(name:cfp.investadv_flag, type:string, comment:null), FieldSchema(name:cfp.pfunds_type, type:string, comment:null), FieldSchema(name:cfp.pfunds_manage_type, type:string, comment:null), FieldSchema(name:cfp.product_open_info, type:string, comment:null), FieldSchema(name:cfp.pfunds_type_info, type:string, comment:null), FieldSchema(name:cfp.prodtrustee_net_no, type:string, comment:null), FieldSchema(name:cfp.trustee_bank_name, type:string, comment:null), FieldSchema(name:cfp.trustee_bank_account, type:string, comment:null), FieldSchema(name:cfp.trust_fund, type:decimal(15,2), comment:null), FieldSchema(name:cfp.trust_begindate, type:decimal(10,0), comment:null), FieldSchema(name:cfp.trust_enddate, type:decimal(10,0), comment:null), FieldSchema(name:cfp.subscriber_extracode, type:string, comment:null), FieldSchema(name:cfp.extracode, type:string, comment:null), FieldSchema(name:cfp.part_date, type:int, comment:null), FieldSchema(name:ici.client_id, type:string, comment:null), FieldSchema(name:ici.businsys_no, type:decimal(10,0), comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20220601001521_02b9c045-ef56-4fbf-9006-114e56c49497); Time taken: 0.168 seconds
INFO : Executing command(queryId=hive_20220601001521_02b9c045-ef56-4fbf-9006-114e56c49497): select * from (select * from ods_img_gj_sed_prod_liming fp where fp.part_date = 20220525) cfp inner join (select ic.client_id, ic.businsys_no from ods_temp_cdt_increment_client ic where ic.businsys_no=5035) ici on cfp.businsys_no = ici.businsys_no and cfp.acct_id = ici.client_id
INFO : Query ID = hive_20220601001521_02b9c045-ef56-4fbf-9006-114e56c49497
INFO : Total jobs = 2
INFO : Launching Job 1 out of 2
INFO : Starting task [Stage-2:MAPRED] in serial mode
INFO : 2022-06-01 00:15:35,219 INFO : Spark job[0] finished successfully in 2.01 second(s)
INFO : Launching Job 2 out of 2
INFO : Starting task [Stage-1:MAPRED] in serial mode
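Job 1 (Stage-2, the small-table side) finished cleanly. With DPP in play, its pruning sink is expected to have written partition-key event files under the session scratch directory, which Job 2's plan generator reads before submitting the big-table stage. In a still-open session, that path can be checked directly from beeline with Hive's dfs command. This is a diagnostic sketch, not part of the original run; it assumes HiveServer2 permits dfs commands, and session scratch directories disappear once the session closes.

-- Diagnostic sketch: list the scratch directory the pruner is about to read.
-- If the DPP sink never wrote its event file, this listing fails the same way.
dfs -ls hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/;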
ERROR : Spark job[-1] failed
java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.FileNotFoundException: File hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/1 does not exist.
	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.runDynamicPartitionPruner(SparkPlanGenerator.java:162) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.generate(SparkPlanGenerator.java:127) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:355) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:400) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:365) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_91]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_91]
	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_91]
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.FileNotFoundException: File hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/1 does not exist.
	at org.apache.hadoop.hive.ql.exec.spark.SparkDynamicPartitionPruner.processFiles(SparkDynamicPartitionPruner.java:161) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkDynamicPartitionPruner.prune(SparkDynamicPartitionPruner.java:83) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.runDynamicPartitionPruner(SparkPlanGenerator.java:160) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	... 8 more
Caused by: java.io.FileNotFoundException: File hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/1 does not exist.
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatusInternal(DistributedFileSystem.java:986) ~[hadoop-hdfs-client-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.access$1000(DistributedFileSystem.java:122) ~[hadoop-hdfs-client-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1046) ~[hadoop-hdfs-client-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1043) ~[hadoop-hdfs-client-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:1053) ~[hadoop-hdfs-client-3.0.0-cdh6.3.2.jar:?]
	at org.apache.hadoop.hive.ql.exec.spark.SparkDynamicPartitionPruner.processFiles(SparkDynamicPartitionPruner.java:133) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkDynamicPartitionPruner.prune(SparkDynamicPartitionPruner.java:83) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.runDynamicPartitionPruner(SparkPlanGenerator.java:160) ~[hive-exec-2.1.1-cdh6.3.2.jar:2.1.1-cdh6.3.2]
	... 8 more
ERROR : FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed due to: File hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/1 does not exist.
INFO : Completed executing command(queryId=hive_20220601001521_02b9c045-ef56-4fbf-9006-114e56c49497); Time taken: 14.496 seconds
Error: Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask. Spark job failed due to: File hdfs://nameservice1/tmp/hive/anonymous/744d8b8a-7a19-4337-a80c-9f2005f65fb5/hive_2022-06-01_00-15-21_619_1701186596611571845-7/-mr-10003/2/1 does not exist. (state=42000,code=3)
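The stack trace localizes the failure to plan generation for Job 2: SparkDynamicPartitionPruner.processFiles lists the scratch subdirectory (-mr-10003/2/1) where Job 1's pruning sink was supposed to leave an event file, and the path does not exist. That appears consistent with scratch-path handling problems in the Hive 2.x Hive-on-Spark DPP implementation rather than with anything wrong in the query itself; the only partition predicate here, part_date = 20220525, is a static filter that prunes at compile time with or without DPP. A session-level workaround sketch, assuming the pruner and not the data is at fault, is to disable Spark DPP and rerun:

-- Workaround sketch (assumes the DPP scratch-file handling is at fault):
-- disable Spark DPP for the session and rerun. The static predicate
-- part_date = 20220525 still limits the scan to a single partition.
set hive.spark.dynamic.partition.pruning=false;
select *
from (select * from ods_img_gj_sed_prod_liming fp where fp.part_date = 20220525) cfp
inner join (select ic.client_id, ic.businsys_no
            from ods_temp_cdt_increment_client ic
            where ic.businsys_no = 5035) ici
  on cfp.businsys_no = ici.businsys_no and cfp.acct_id = ici.client_id;

Disabling DPP only sidesteps the scratch-file lookup; moving to a Hive release with the reworked Spark DPP sink path handling is the durable fix.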