HIVE-25965: SQLDataException when obtaining partitions from HMS via direct SQL over Derby


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Metastore
    • Labels: None

    Description

      In certain cases, fetching partition information from the metastore via direct SQL fails with the stack trace below.

      javax.jdo.JDODataStoreException: Error executing SQL query "select "PARTITIONS"."PART_ID" from "PARTITIONS"  inner join "TBLS" on "PARTITIONS"."TBL_ID" = "TBLS"."TBL_ID"     and "TBLS"."TBL_NAME" = ?   inner join "DBS" on "TBLS"."DB_ID" = "DBS"."DB_ID"      and "DBS"."NAME" = ? inner join "PARTITION_KEY_VALS" "FILTER0" on "FILTER0"."PART_ID" = "PARTITIONS"."PART_ID" and "FILTER0"."INTEGER_IDX" = 0 where "DBS"."CTLG_NAME" = ?  and (((case when "FILTER0"."PART_KEY_VAL" <> ? and "TBLS"."TBL_NAME" = ? and "DBS"."NAME" = ? and "DBS"."CTLG_NAME" = ? and "FILTER0"."PART_ID" = "PARTITIONS"."PART_ID" and "FILTER0"."INTEGER_IDX" = 0 then cast("FILTER0"."PART_KEY_VAL" as decimal(21,0)) else null end) = ?))".
      	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:542) ~[datanucleus-api-jdo-5.2.4.jar:?]
      	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:456) ~[datanucleus-api-jdo-5.2.4.jar:?]
      	at org.datanucleus.api.jdo.JDOQuery.executeWithArray(JDOQuery.java:318) ~[datanucleus-api-jdo-5.2.4.jar:?]
      	at org.apache.hadoop.hive.metastore.QueryWrapper.executeWithArray(QueryWrapper.java:137) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.MetastoreDirectSqlUtils.executeWithArray(MetastoreDirectSqlUtils.java:69) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.executeWithArray(MetaStoreDirectSql.java:2156) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPartitionIdsViaSqlFilter(MetaStoreDirectSql.java:894) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.getPartitionsViaSqlFilter(MetaStoreDirectSql.java:663) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:3962) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.ObjectStore$11.getSqlResult(ObjectStore.java:3953) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.run(ObjectStore.java:4269) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.ObjectStore.getPartitionsByExprInternal(ObjectStore.java:3989) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.VerifyingObjectStore.getPartitionsByExpr(VerifyingObjectStore.java:80) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT-tests.jar:4.0.0-SNAPSHOT]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
      	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:97) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at com.sun.proxy.$Proxy60.getPartitionsByExpr(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hive.metastore.HMSHandler.get_partitions_spec_by_expr(HMSHandler.java:7346) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
      	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108) ~[hive-standalone-metastore-server-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at com.sun.proxy.$Proxy61.get_partitions_spec_by_expr(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getPartitionsSpecByExprInternal(HiveMetaStoreClient.java:2238) ~[hive-standalone-metastore-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClientWithLocalCache.getPartitionsSpecByExprInternal(HiveMetaStoreClientWithLocalCache.java:389) ~[hive-standalone-metastore-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.getPartitionsSpecByExprInternal(SessionHiveMetaStoreClient.java:2286) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.listPartitionsSpecByExpr(HiveMetaStoreClient.java:2250) ~[hive-standalone-metastore-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.listPartitionsSpecByExpr(SessionHiveMetaStoreClient.java:1353) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:218) ~[hive-standalone-metastore-common-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at com.sun.proxy.$Proxy62.listPartitionsSpecByExpr(Unknown Source) ~[?:?]
      	at org.apache.hadoop.hive.ql.metadata.Hive.getPartitionsByExpr(Hive.java:4296) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner.getPartitionsFromServer(PartitionPruner.java:455) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner.prune(PartitionPruner.java:228) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.optimizer.calcite.RelOptHiveTable.computePartitionList(RelOptHiveTable.java:480) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HivePartitionPruneRule.perform(HivePartitionPruneRule.java:63) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.optimizer.calcite.rules.HivePartitionPruneRule.onMatch(HivePartitionPruneRule.java:46) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:333) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:542) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:407) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:243) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepInstruction$RuleInstance.execute(HepInstruction.java:127) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:202) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:189) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.executeProgram(CalcitePlanner.java:2458) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.executeProgram(CalcitePlanner.java:2419) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.applyPreJoinOrderingTransforms(CalcitePlanner.java:1939) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1706) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1583) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.tools.Frameworks.lambda$withPlanner$0(Frameworks.java:131) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:914) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:180) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:126) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1335) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:566) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12584) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:459) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:317) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:224) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:106) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:501) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:453) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:417) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:411) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:121) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:227) [hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:256) [hive-cli-4.0.0-SNAPSHOT.jar:?]
      	at org.apache.hadoop.hive.cli.CliDriver.processCmd1(CliDriver.java:201) [hive-cli-4.0.0-SNAPSHOT.jar:?]
      	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:127) [hive-cli-4.0.0-SNAPSHOT.jar:?]
      	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422) [hive-cli-4.0.0-SNAPSHOT.jar:?]
      	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:353) [hive-cli-4.0.0-SNAPSHOT.jar:?]
      	at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:727) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:697) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:114) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:157) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:62) [test-classes/:?]
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_261]
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_261]
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_261]
      	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_261]
      	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) [junit-4.13.jar:4.13]
      	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) [junit-4.13.jar:4.13]
      	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) [junit-4.13.jar:4.13]
      	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) [junit-4.13.jar:4.13]
      	at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:135) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) [junit-4.13.jar:4.13]
      	at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366) [junit-4.13.jar:4.13]
      	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103) [junit-4.13.jar:4.13]
      	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.run(ParentRunner.java:413) [junit-4.13.jar:4.13]
      	at org.junit.runners.Suite.runChild(Suite.java:128) [junit-4.13.jar:4.13]
      	at org.junit.runners.Suite.runChild(Suite.java:27) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293) [junit-4.13.jar:4.13]
      	at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:95) [hive-it-util-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
      	at org.junit.rules.RunRules.evaluate(RunRules.java:20) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306) [junit-4.13.jar:4.13]
      	at org.junit.runners.ParentRunner.run(ParentRunner.java:413) [junit-4.13.jar:4.13]
      	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365) [surefire-junit4-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273) [surefire-junit4-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238) [surefire-junit4-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159) [surefire-junit4-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:377) [surefire-booter-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:138) [surefire-booter-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:465) [surefire-booter-3.0.0-M4.jar:3.0.0-M4]
      	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:451) [surefire-booter-3.0.0-M4.jar:3.0.0-M4]
      Caused by: java.sql.SQLDataException: Invalid character string format for type DECIMAL.
      	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedResultSet.closeOnTransactionError(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedResultSet.movePosition(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedResultSet.next(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at com.zaxxer.hikari.pool.HikariProxyResultSet.next(HikariProxyResultSet.java) ~[HikariCP-2.6.1.jar:?]
      	at org.datanucleus.store.rdbms.query.ForwardQueryResult.initialise(ForwardQueryResult.java:93) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:687) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.store.query.Query.executeQuery(Query.java:1975) ~[datanucleus-core-5.2.4.jar:?]
      	at org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:818) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:433) ~[datanucleus-api-jdo-5.2.4.jar:?]
      	... 120 more
      Caused by: org.apache.derby.iapi.error.StandardException: Invalid character string format for type DECIMAL.
      	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.iapi.types.DataType.invalidFormat(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.iapi.types.DataType.setValue(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.exe.ac363c44a2x017fx07d1x9374x0000216d27b02ab.e7(Unknown Source) ~[?:?]
      	at org.apache.derby.impl.services.reflect.DirectCall.invoke(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.sql.execute.ProjectRestrictResultSet.getNextRowCore(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.sql.execute.NestedLoopJoinResultSet.getNextRowCore(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.sql.execute.ProjectRestrictResultSet.getNextRowCore(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.sql.execute.BasicNoPutResultSetImpl.getNextRow(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedResultSet.movePosition(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at org.apache.derby.impl.jdbc.EmbedResultSet.next(Unknown Source) ~[derby-10.14.1.0.jar:?]
      	at com.zaxxer.hikari.pool.HikariProxyResultSet.next(HikariProxyResultSet.java) ~[HikariCP-2.6.1.jar:?]
      	at org.datanucleus.store.rdbms.query.ForwardQueryResult.initialise(ForwardQueryResult.java:93) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.store.rdbms.query.SQLQuery.performExecute(SQLQuery.java:687) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.store.query.Query.executeQuery(Query.java:1975) ~[datanucleus-core-5.2.4.jar:?]
      	at org.datanucleus.store.rdbms.query.SQLQuery.executeWithArray(SQLQuery.java:818) ~[datanucleus-rdbms-5.2.4.jar:?]
      	at org.datanucleus.api.jdo.JDOQuery.executeInternal(JDOQuery.java:433) ~[datanucleus-api-jdo-5.2.4.jar:?]
      	... 120 more
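
      For context, this direct SQL is generated while pruning partitions for a Hive query of roughly the following shape. The sketch is illustrative only: the schema and the partition column name are assumptions, while the table name and the constant 10 come from the generated metastore query shown next.

      -- Illustrative HiveQL (assumed schema, not the literal qtest contents):
      -- an integral equality predicate on a partition column of src_bucket_tbl
      -- leads direct SQL to compare cast("PART_KEY_VAL" as decimal(21,0)) to 10.
      CREATE TABLE src_bucket_tbl (key STRING, value STRING) PARTITIONED BY (part_no INT);
      SELECT * FROM src_bucket_tbl WHERE part_no = 10;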
      

      Derby raises the exception when executing the following query (the bind parameters from the stack trace are inlined here for readability).

      SELECT "PARTITIONS"."PART_ID"
      FROM "PARTITIONS"
      INNER JOIN "TBLS" ON "PARTITIONS"."TBL_ID" = "TBLS"."TBL_ID"
      AND "TBLS"."TBL_NAME" = 'src_bucket_tbl'
      INNER JOIN "DBS" ON "TBLS"."DB_ID" = "DBS"."DB_ID"
      AND "DBS"."NAME" = 'default'
      INNER JOIN "PARTITION_KEY_VALS" "FILTER0" ON "FILTER0"."PART_ID" = "PARTITIONS"."PART_ID"
      AND "FILTER0"."INTEGER_IDX" = 0
      WHERE "DBS"."CTLG_NAME" = 'hive'
        AND (((CASE
                   WHEN "FILTER0"."PART_KEY_VAL" <> '__HIVE_DEFAULT_PARTITION__'
                        AND "TBLS"."TBL_NAME" = 'src_bucket_tbl'
                        AND "DBS"."NAME" = 'default'
                        AND "DBS"."CTLG_NAME" = 'hive'
                        AND "FILTER0"."PART_ID" = "PARTITIONS"."PART_ID"
                        AND "FILTER0"."INTEGER_IDX" = 0 THEN cast("FILTER0"."PART_KEY_VAL" AS decimal(21, 0))
                   ELSE NULL
               END) = 10))
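
      The failure comes from the CAST in the THEN branch: the moment Derby evaluates it on a "PART_KEY_VAL" that is not a valid number, the whole statement fails with "Invalid character string format for type DECIMAL". The guard conditions in the CASE repeat the join predicates, presumably to keep the CAST away from rows belonging to other tables, yet the exception indicates Derby still evaluates it on a non-numeric value. A minimal standalone sketch of the error itself follows (hypothetical table and data, runnable in Derby's ij; here the guard simply does not exclude the bad row, unlike the query above):

      -- The CASE guard does not rule out 'not_a_number', so the CAST is
      -- attempted on that row and the statement fails with
      -- "Invalid character string format for type DECIMAL".
      CREATE TABLE kv ("PART_ID" INTEGER, "PART_KEY_VAL" VARCHAR(256));
      INSERT INTO kv VALUES (1, '10'), (2, 'not_a_number');
      SELECT "PART_ID" FROM kv
      WHERE (CASE WHEN "PART_KEY_VAL" <> '__HIVE_DEFAULT_PARTITION__'
                  THEN CAST("PART_KEY_VAL" AS DECIMAL(21,0))
                  ELSE NULL END) = 10;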
      

      The problem can be reproduced at revision aafced6cd2b1bf31bd74c563680e02fd54cd1e01 by running the following qtests together.

      mvn test -Dtest=TestMiniLlapLocalCliDriver -Dqfile=list_bucket_dml_9.q,llap_partitioned.q,load_static_ptn_into_bucketed_table.q
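
      The attached derby-dump.tar.gz presumably captures the metastore database in the failing state. A quick way to see which values end up in "PARTITION_KEY_VALS" (and hence what the CAST can encounter) is a query along the following lines, run in Derby's ij against the extracted database; the default "APP" schema is an assumption:

      -- List every first-level partition key value together with its table,
      -- to spot non-numeric values that the DECIMAL cast could trip over.
      SELECT T."TBL_NAME", KV."PART_KEY_VAL"
      FROM "APP"."PARTITION_KEY_VALS" KV
      INNER JOIN "APP"."PARTITIONS" P ON KV."PART_ID" = P."PART_ID"
      INNER JOIN "APP"."TBLS" T ON P."TBL_ID" = T."TBL_ID"
      WHERE KV."INTEGER_IDX" = 0;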
      

      Attachments

        1. derby-dump.tar.gz (327 kB, Stamatis Zampetakis)



            People

              Assignee: Unassigned
              Reporter: Stamatis Zampetakis (zabetak)
              Votes: 0
              Watchers: 1
