Apache Hudi / HUDI-7408

LSM timeline writer fails with compaction


Details

    Description

      Running a single writer with inline compaction on a MOR (merge-on-read) table fails while archiving the timeline.
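      For reference, a minimal PySpark writer setup matching this scenario might look like the sketch below. The table name, base path, record/precombine fields, and the input frame `df` are hypothetical; the option keys are the standard Hudi write configs for a MOR table with inline compaction and the metadata table enabled (the default in recent releases).

          # Hypothetical repro sketch: single writer, MOR table, inline compaction.
          # df is an existing Spark DataFrame; names and values below are assumptions.
          hudi_options = {
              "hoodie.table.name": "test_table",                      # assumed name
              "hoodie.datasource.write.table.type": "MERGE_ON_READ",
              "hoodie.datasource.write.operation": "upsert",
              "hoodie.datasource.write.recordkey.field": "id",        # assumed field
              "hoodie.datasource.write.precombine.field": "ts",       # assumed field
              "hoodie.compact.inline": "true",
              "hoodie.compact.inline.max.delta.commits": "2",         # assumed threshold
              "hoodie.metadata.enable": "true",                       # default
          }

          (df.write.format("hudi")
             .options(**hudi_options)
             .mode("append")
             .save("s3://<path>/table"))

      The root-cause exception: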

       

      Caused by: java.io.FileNotFoundException: No such file or directory 's3://<path>/.hoodie/metadata/.hoodie/archived/00000000000000010_00000000000000012_0.parquet'
          at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.getFileStatus(S3NativeFileSystem.java:560) ~[emrfs-hadoop-assembly-2.60.0.jar:?]
          at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.getFileStatus(EmrFileSystem.java:623) ~[emrfs-hadoop-assembly-2.60.0.jar:?]
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.lambda$getFileStatus$17(HoodieWrapperFileSystem.java:410) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.executeFuncWithTimeMetrics(HoodieWrapperFileSystem.java:114) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.getFileStatus(HoodieWrapperFileSystem.java:404) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.client.timeline.LSMTimelineWriter.getFileEntry(LSMTimelineWriter.java:309) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.client.timeline.LSMTimelineWriter.updateManifest(LSMTimelineWriter.java:158) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.client.timeline.LSMTimelineWriter.updateManifest(LSMTimelineWriter.java:137) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          at org.apache.hudi.client.timeline.LSMTimelineWriter.write(LSMTimelineWriter.java:118) ~[org.apache.hudi_hudi-spark3.4-bundle_2.12-1.0.0-beta1.jar:1.0.0-beta1]
          ... 64 more
      Traceback (most recent call last):
        File "/mnt/tmp/spark-ac2781c4-7207-458f-9a0d-b677b85d93ad/run_experiment_aws.py", line 103, in <module>
          main(args)
        File "/mnt/tmp/spark-ac2781c4-7207-458f-9a0d-b677b85d93ad/run_experiment_aws.py", line 90, in main
          experiment.run_queries(spark, metrics)
        File "/mnt/tmp/spark-ac2781c4-7207-458f-9a0d-b677b85d93ad/onebench.zip/onebench/experiments/experiments.py", line 178, in run_queries
        File "/mnt/tmp/spark-ac2781c4-7207-458f-9a0d-b677b85d93ad/onebench.zip/onebench/experiments/queries.py", line 34, in __call__
        File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1440, in sql
        File "/usr/lib/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1322, in __call__
        File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/errors/exceptions/captured.py", line 169, in deco
        File "/usr/lib/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
      py4j.protocol.Py4JJavaError: An error occurred while calling o100.sql.
      : org.apache.hudi.exception.HoodieException: Failed to instantiate Metadata table
          at org.apache.hudi.client.SparkRDDWriteClient.initializeMetadataTable(SparkRDDWriteClient.java:293)
          at org.apache.hudi.client.SparkRDDWriteClient.initMetadataTable(SparkRDDWriteClient.java:273)
          at org.apache.hudi.client.BaseHoodieWriteClient.doInitTable(BaseHoodieWriteClient.java:1250)
          at org.apache.hudi.client.BaseHoodieWriteClient.initTable(BaseHoodieWriteClient.java:1290)
          at org.apache.hudi.client.SparkRDDWriteClient.upsert(SparkRDDWriteClient.java:139)
          at org.apache.hudi.DataSourceUtils.doWriteOperation(DataSourceUtils.java:224)
          at org.apache.hudi.HoodieSparkSqlWriterInternal.writeInternal(HoodieSparkSqlWriter.scala:506)
          at org.apache.hudi.HoodieSparkSqlWriterInternal.write(HoodieSparkSqlWriter.scala:196)
          at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:121)
          at org.apache.spark.sql.hudi.command.MergeIntoHoodieTableCommand.executeUpsert(MergeIntoHoodieTableCommand.scala:469)
          at org.apache.spark.sql.hudi.command.MergeIntoHoodieTableCommand.run(MergeIntoHoodieTableCommand.scala:283)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:104)
          at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
          at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250)
          at org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:123)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$9(SQLExecution.scala:160)
          at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
          at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$8(SQLExecution.scala:160)
          at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:271)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:159)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
          at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:69)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:101)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
          at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:554)
          at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:107)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:554)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:530)
          at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:97)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:84)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:82)
          at org.apache.spark.sql.Dataset.<init>(Dataset.scala:221)
          at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:101)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
          at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:98)
          at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:640)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
          at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:630)
          at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:662)
          at sun.reflect.GeneratedMethodAccessor108.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
          at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
          at py4j.Gateway.invoke(Gateway.java:282)
          at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
          at py4j.commands.CallCommand.execute(CallCommand.java:79)
          at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
          at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
          at java.lang.Thread.run(Thread.java:750)
      Caused by: org.apache.hudi.exception.HoodieCommitException: Failed to write commits
          at org.apache.hudi.client.timeline.LSMTimelineWriter.write(LSMTimelineWriter.java:120)
          at org.apache.hudi.client.timeline.HoodieTimelineArchiver.archiveIfRequired(HoodieTimelineArchiver.java:112)
          at org.apache.hudi.client.BaseHoodieTableServiceClient.archive(BaseHoodieTableServiceClient.java:788)
          at org.apache.hudi.client.BaseHoodieWriteClient.archive(BaseHoodieWriteClient.java:885)
          at org.apache.hudi.client.BaseHoodieWriteClient.archive(BaseHoodieWriteClient.java:895)
          at org.apache.hudi.metadata.HoodieBackedTableMetadataWriter.performTableServices(HoodieBackedTableMetadataWriter.java:1325)
          at org.apache.hudi.client.SparkRDDWriteClient.initializeMetadataTable(SparkRDDWriteClient.java:290)
          ... 58 more
      Caused by: java.io.FileNotFoundException: No such file or directory 's3://<basepath>/.hoodie/metadata/.hoodie/archived/00000000000000010_00000000000000012_0.parquet'
          at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.getFileStatus(S3NativeFileSystem.java:560)
          at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.getFileStatus(EmrFileSystem.java:623)
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.lambda$getFileStatus$17(HoodieWrapperFileSystem.java:410)
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.executeFuncWithTimeMetrics(HoodieWrapperFileSystem.java:114)
          at org.apache.hudi.common.fs.HoodieWrapperFileSystem.getFileStatus(HoodieWrapperFileSystem.java:404)
          at org.apache.hudi.client.timeline.LSMTimelineWriter.getFileEntry(LSMTimelineWriter.java:309)
          at org.apache.hudi.client.timeline.LSMTimelineWriter.updateManifest(LSMTimelineWriter.java:158)
          at org.apache.hudi.client.timeline.LSMTimelineWriter.updateManifest(LSMTimelineWriter.java:137)
          at org.apache.hudi.client.timeline.LSMTimelineWriter.write(LSMTimelineWriter.java:118)
          ... 64 more
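      From the trace, the failing path is: HoodieBackedTableMetadataWriter.performTableServices -> archive -> HoodieTimelineArchiver.archiveIfRequired -> LSMTimelineWriter.write -> updateManifest -> getFileEntry, which stats every parquet file listed in the new manifest. The stat fails because 00000000000000010_00000000000000012_0.parquet is no longer on storage, presumably already removed by a compaction/clean of the archived LSM timeline while the manifest still referenced it.

      To compare the manifest's entries against what actually exists on storage, a quick triage sketch (run in the same PySpark session; the Hadoop FileSystem calls are the standard ones, and the base path is the elided s3://<path> from the trace):

          # Triage sketch: list the files that actually exist under the metadata
          # table's archived (LSM) timeline, with their names and sizes.
          jvm = spark._jvm
          conf = spark._jsc.hadoopConfiguration()
          archived = jvm.org.apache.hadoop.fs.Path(
              "s3://<path>/.hoodie/metadata/.hoodie/archived/")
          fs = archived.getFileSystem(conf)
          for status in fs.listStatus(archived):
              print(status.getPath().getName(), status.getLen())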
      

       

       

      People

        Assignee: danny0405 (Danny Chen)
        Reporter: xushiyan (Shiyan Xu)
