Apache Hudi / HUDI-5259

Spark reports an error when modifying the schema of a Hudi table created via Flink


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Fix Version/s: None
    • Affects Version/s: 0.12.1
    • Component/s: flink-sql, spark-sql
    • Labels: None

    Description

      Spark SQL fails when altering the schema of a Hudi table that was created and written through Flink SQL. Steps to reproduce:

      • Flink: create the table and write data
      CREATE CATALOG myhudi WITH (
          'type' = 'hudi',
          'default-database' = 'default',
          'catalog.path' = '/user/hdpu/warehouse',
          'mode' = 'hms',
          'hive.conf.dir' = 'hdfs:///user/hdpu/streamx/conf_data/hive_conf'
      );

      CREATE CATALOG test WITH (
          'type' = 'extra_memory'
      );

      CREATE CATALOG myhive WITH (
          'type' = 'hive',
          'default-database' = 'default',
          'hive-conf-dir' = 'hdfs:///user/hdpu/streamx/conf_data/hive_conf'
      );

      drop table if exists myhudi.test_hudi3.hudi_test19;

      create table if not exists myhudi.test_hudi3.hudi_test19 (
          id bigint not null,
          `name` string not null,
          ts bigint,
          PRIMARY KEY (`id`) NOT ENFORCED
      ) with (
          'connector' = 'hudi',
          'table.type' = 'COPY_ON_WRITE',
          'index.type' = 'BUCKET',
          'precombine.field' = 'ts'
      );

      show create table myhudi.test_hudi3.hudi_test19;

      insert into myhudi.test_hudi3.hudi_test19
      values (1, '888', 43);
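
      Before running the ALTER from Spark, it helps to compare the schema Spark resolves with the column list that the Flink HMS sync actually wrote, since the failure below is a positional column mismatch. A minimal diagnostic sketch (assuming the table synced to Hive as test_hudi3.hudi_test19; this step is not part of the original report):

      -- In spark-sql: the schema Spark resolves through HiveExternalCatalog
      describe table test_hudi3.hudi_test19;
      -- In Hive (e.g. beeline): the column names, types, and order stored in the metastore
      describe formatted test_hudi3.hudi_test19;
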
      • Spark: alter the table schema
      alter table test_hudi3.hudi_test19 add column col2 int;
      • The error is:

      22/11/22 16:09:20 WARN HiveExternalCatalog: Could not alter schema of table `test_hudi3`.`hudi_test19` in a Hive compatible way. Updating Hive metastore in Spark SQL specific format.
      java.lang.reflect.InvocationTargetException
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.spark.sql.hive.client.Shim_v2_1.alterTable(HiveShim.scala:1303)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$alterTableDataSchema$1(HiveClientImpl.scala:605)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:305)
          at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:236)
          at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:235)
          at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:285)
          at org.apache.spark.sql.hive.client.HiveClientImpl.alterTableDataSchema(HiveClientImpl.scala:586)
          at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$alterTableDataSchema$1(HiveExternalCatalog.scala:686)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
          at org.apache.spark.sql.hive.HiveExternalCatalog.alterTableDataSchema(HiveExternalCatalog.scala:672)
          at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.alterTableDataSchema(ExternalCatalogWithListener.scala:124)
          at org.apache.spark.sql.catalyst.catalog.SessionCatalog.alterTableDataSchema(SessionCatalog.scala:459)
          at org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand.refreshSchemaInMeta(AlterHoodieTableAddColumnsCommand.scala:90)
          at org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand.run(AlterHoodieTableAddColumnsCommand.scala:70)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
          at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
          at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
          at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
          at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
          at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
          at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
          at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
          at com.grg.spark.catalogs.test.SqlUtils.exeucteSqlFile2(SqlUtils.java:54)
          at com.grg.spark.catalogs.test.TestSpark.test1(TestSpark.java:29)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
          at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
          at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
          at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
          at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
          at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
          at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
          at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
          at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
          at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
          at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
          at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
          at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
          at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
          at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
          at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
          at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
          at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
          at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
      Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. The following columns have types incompatible with the existing columns in their respective positions :
      id,_extra_test,col2
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:634)
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:612)
          ... 75 more
      Caused by: InvalidOperationException(message:The following columns have types incompatible with the existing columns in their respective positions :
      id,_extra_test,col2)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:59744)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:59730)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result.read(ThriftHiveMetastore.java:59672)
          at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:88)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_alter_table_with_environment_context(ThriftHiveMetastore.java:1693)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.alter_table_with_environment_context(ThriftHiveMetastore.java:1677)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table_with_environmentContext(HiveMetaStoreClient.java:373)
          at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.alter_table_with_environmentContext(SessionHiveMetaStoreClient.java:322)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
          at com.sun.proxy.$Proxy25.alter_table_with_environmentContext(Unknown Source)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2327)
          at com.sun.proxy.$Proxy25.alter_table_with_environmentContext(Unknown Source)
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:630)
          ... 76 more
      org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. The following columns have types incompatible with the existing columns in their respective positions :
      col
          at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:111)
          at org.apache.spark.sql.hive.HiveExternalCatalog.alterTableDataSchema(HiveExternalCatalog.scala:672)
          at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.alterTableDataSchema(ExternalCatalogWithListener.scala:124)
          at org.apache.spark.sql.catalyst.catalog.SessionCatalog.alterTableDataSchema(SessionCatalog.scala:459)
          at org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand.refreshSchemaInMeta(AlterHoodieTableAddColumnsCommand.scala:90)
          at org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand.run(AlterHoodieTableAddColumnsCommand.scala:70)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
          at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:110)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
          at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
          at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:110)
          at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:106)
          at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:481)
          at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:82)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:481)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
          at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
          at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:457)
          at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:106)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:93)
          at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:91)
          at org.apache.spark.sql.Dataset.<init>(Dataset.scala:219)
          at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
          at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
          at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
          at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
          at com.grg.spark.catalogs.test.SqlUtils.exeucteSqlFile2(SqlUtils.java:54)
          at com.grg.spark.catalogs.test.TestSpark.test1(TestSpark.java:29)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
          at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
          at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
          at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
          at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
          at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
          at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
          at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
          at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
          at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
          at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
          at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
          at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
          at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
          at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
          at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
          at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
          at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
          at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
      Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter table. The following columns have types incompatible with the existing columns in their respective positions :
      col
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:634)
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:612)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.spark.sql.hive.client.Shim_v2_1.alterTable(HiveShim.scala:1303)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$alterTableDataSchema$1(HiveClientImpl.scala:605)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:305)
          at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:236)
          at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:235)
          at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:285)
          at org.apache.spark.sql.hive.client.HiveClientImpl.alterTableDataSchema(HiveClientImpl.scala:586)
          at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$alterTableDataSchema$1(HiveExternalCatalog.scala:693)
          at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
          at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
          ... 60 more
      Caused by: InvalidOperationException(message:The following columns have types incompatible with the existing columns in their respective positions :
      col)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:59744)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result$alter_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:59730)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$alter_table_with_environment_context_result.read(ThriftHiveMetastore.java:59672)
          at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:88)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_alter_table_with_environment_context(ThriftHiveMetastore.java:1693)
          at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.alter_table_with_environment_context(ThriftHiveMetastore.java:1677)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_table_with_environmentContext(HiveMetaStoreClient.java:373)
          at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.alter_table_with_environmentContext(SessionHiveMetaStoreClient.java:322)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
          at com.sun.proxy.$Proxy25.alter_table_with_environmentContext(Unknown Source)
          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
          at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
          at java.lang.reflect.Method.invoke(Method.java:498)
          at org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2327)
          at com.sun.proxy.$Proxy25.alter_table_with_environmentContext(Unknown Source)
          at org.apache.hadoop.hive.ql.metadata.Hive.alterTable(Hive.java:630)
          ... 76 more
      22/11/22 16:09:20 INFO SparkContext: Invoking stop() from shutdown hook
      22/11/22 16:09:20 INFO AbstractConnector: Stopped Spark@3fcdcf{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
      22/11/22 16:09:20 INFO SparkUI: Stopped Spark web UI at http://cc:4040
      22/11/22 16:09:20 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
      22/11/22 16:09:20 INFO MemoryStore: MemoryStore cleared
      22/11/22 16:09:20 INFO BlockManager: BlockManager stopped
      22/11/22 16:09:20 INFO BlockManagerMaster: BlockManagerMaster stopped
      22/11/22 16:09:20 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
      22/11/22 16:09:20 INFO SparkContext: Successfully stopped SparkContext
      22/11/22 16:09:20 INFO ShutdownHookManager: Shutdown hook called
      22/11/22 16:09:20 INFO ShutdownHookManager: Deleting directory C:\Users\chenchao4\AppData\Local\Temp\spark-4d035759-7a41-42a0-87a4-8ca5b5eaa55f
      Process finished with exit code -1
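
      The root cause visible in the trace is the metastore's positional column check in alter_table_with_environment_context: the ALTER is rejected because the column list Spark sends does not line up, position by position, with the columns recorded when the table was synced from Flink (note that the first rejected list, id,_extra_test,col2, includes a column the ALTER never touched). For comparison, the same ALTER is expected to succeed on a table created directly through Spark SQL; a control-case sketch, where the _spark table name is hypothetical:

      -- Create an equivalent Hudi table from Spark SQL, then alter it.
      create table test_hudi3.hudi_test19_spark (
          id bigint, name string, ts bigint
      ) using hudi
      tblproperties (type = 'cow', primaryKey = 'id', preCombineField = 'ts');

      alter table test_hudi3.hudi_test19_spark add columns (col2 int);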
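
      A possible mitigation, stated as an assumption rather than a verified fix: the check that raises InvalidOperationException is controlled by the standard HMS property hive.metastore.disallow.incompatible.col.type.changes. Since the trace shows a remote Thrift metastore, relaxing it would normally require a change in the metastore server's hive-site.xml; with an embedded metastore it can be passed through Spark's Hadoop configuration at launch:

      -- Untested sketch: relax the strict positional type check, then retry the ALTER.
      -- spark-sql --conf spark.hadoop.hive.metastore.disallow.incompatible.col.type.changes=false
      alter table test_hudi3.hudi_test19 add column col2 int;
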

    People

        Assignee: Unassigned
        Reporter: waywtdcc
        Votes: 0
        Watchers: 1