Description
When inserting into a table using dynamic partitions with spark.speculation=true, skewed data in some partitions triggers speculative tasks. The original attempt and the speculative attempt then write to the same output file, and the job throws an exception like:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): Lease mismatch on /tmp/hive-jeanlyn/hive_2015-06-15_15-20-44_734_8801220787219172413-1/-ext-10000/ds=2015-06-15/type=2/part-00301.lzo owned by DFSClient_attempt_201506031520_0011_m_000189_0_-1513487243_53 but is accessed by DFSClient_attempt_201506031520_0011_m_000042_0_-1275047721_57
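The failure happens because HDFS grants a write lease on a file to a single client: the first task attempt opens the partition output file, and when a speculative duplicate attempt opens the same path, the NameNode rejects it with a LeaseExpiredException ("Lease mismatch"). A minimal reproduction sketch, assuming a hypothetical source table `src` and partitioned target table `tgt` (table names and columns are illustrative, not from the report):

```sql
-- In spark-defaults.conf (or via --conf):
--   spark.speculation=true

-- Enable Hive dynamic partitioning for the session
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;

-- Skew in (ds, type) makes some tasks slow enough to be
-- speculatively re-launched; both attempts then target the same
-- .../ds=.../type=.../part-NNNNN file and the lease conflicts.
INSERT OVERWRITE TABLE tgt PARTITION (ds, type)
SELECT key, value, ds, type FROM src;
```

Disabling speculation (spark.speculation=false) for such jobs avoids the conflict, at the cost of losing straggler mitigation.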
Issue Links
- is duplicated by SPARK-6067: "Spark sql hive dynamic partitions job will fail if task fails" (Resolved)