Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
Description
I have tested writing to Hudi with both COW and MOR table types. When HOODIE_AUTO_COMMIT_PROP is set to true, a FileAlreadyExistsException is thrown:
Exception in thread "main" org.apache.hudi.exception.HoodieIOException: Failed to create file /tmp/hudi/tbl_price_cow1/.hoodie/20201202104150.commit
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.createImmutableFileInPath(HoodieActiveTimeline.java:474)
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.transitionState(HoodieActiveTimeline.java:350)
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.transitionState(HoodieActiveTimeline.java:325)
    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.saveAsComplete(HoodieActiveTimeline.java:144)
    at org.apache.hudi.client.AbstractHoodieWriteClient.commitStats(AbstractHoodieWriteClient.java:181)
    at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:101)
    at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:413)
    at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:210)
    at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:125)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:137)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:133)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:161)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:158)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:133)
It seems that ../.hoodie/20201202104150.commit was committed twice, resulting in this exception.
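For reference, a minimal writer configuration along the lines of what should trigger this code path (the table path matches the stack trace; the schema, record key, and precombine field are illustrative; `hoodie.auto.commit` is, as I understand it, the key behind HOODIE_AUTO_COMMIT_PROP in Hudi 0.x):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object AutoCommitRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hudi-auto-commit-repro")
      .master("local[2]")
      .getOrCreate()

    import spark.implicits._
    // Illustrative data; any schema with a record key and precombine field works.
    val df = Seq((1, "a", 10.0, 20201202L)).toDF("id", "name", "price", "ts")

    df.write.format("hudi")
      .option("hoodie.table.name", "tbl_price_cow1")
      .option("hoodie.datasource.write.recordkey.field", "id")
      .option("hoodie.datasource.write.precombine.field", "ts")
      .option("hoodie.datasource.write.table.type", "COPY_ON_WRITE")
      // With auto-commit enabled, the datasource writer's own commit step
      // appears to find the .commit file already created, hence the exception.
      .option("hoodie.auto.commit", "true")
      .mode(SaveMode.Append)
      .save("/tmp/hudi/tbl_price_cow1")

    spark.stop()
  }
}
```

The same failure was observed with MERGE_ON_READ as the table type.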