Spark / SPARK-6024

When a data source table has too many columns, its schema cannot be stored in the metastore.


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.3.0
    • Component/s: SQL
    • Labels: None

    Description

      Because we store a data source table's schema as a single JSON string in the table properties of its Hive metastore table, a schema that is too wide cannot be persisted: the serialized JSON exceeds the metastore's size limit for a property value (VARCHAR(4000) for TABLE_PARAMS.PARAM_VALUE in the default Derby metastore), and table creation fails with the truncation error below.
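
      A minimal repro sketch against the 1.3-era shell API that appears in the trace below; the exact column count needed depends on field-name length, and 500 string columns is just an assumption that comfortably pushes the schema JSON past 4,000 characters:

      // Assumes spark-shell with Hive support, where `sc` is the SparkContext
      // and `sqlContext` is a HiveContext (saveAsTable goes through the Hive
      // metastore catalog shown in the trace).
      import org.apache.spark.sql.Row
      import org.apache.spark.sql.types.{StringType, StructField, StructType}

      // Each field adds roughly 40 characters to the schema JSON, so 500
      // columns is far past Derby's 4000-character PARAM_VALUE limit.
      val schema = StructType((1 to 500).map(i => StructField(s"col_$i", StringType)))
      val rows = sc.parallelize(Seq(Row.fromSeq(Seq.fill(500)("x"))))
      val wideDF = sqlContext.createDataFrame(rows, schema)

      wideDF.saveAsTable("wide_table") // fails with the truncation error below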

      15/02/25 18:13:50 ERROR metastore.RetryingHMSHandler: Retrying HMSHandler after 1000 ms (attempt 1 of 1) with error: javax.jdo.JDODataStoreException: Put request failed : INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?) 
      	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
      	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
      	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
      	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
      	at com.sun.proxy.$Proxy15.createTable(Unknown Source)
      	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1261)
      	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
      	at com.sun.proxy.$Proxy16.create_table_with_environment_context(Unknown Source)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
      	at com.sun.proxy.$Proxy17.createTable(Unknown Source)
      	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
      	at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:136)
      	at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:243)
      	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:55)
      	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:55)
      	at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:65)
      	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1092)
      	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1092)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1013)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:963)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:929)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:907)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
      	at $line39.$read$$iwC$$iwC$$iwC.<init>(<console>:42)
      	at $line39.$read$$iwC$$iwC.<init>(<console>:44)
      	at $line39.$read$$iwC.<init>(<console>:46)
      	at $line39.$read.<init>(<console>:48)
      	at $line39.$read$.<init>(<console>:52)
      	at $line39.$read$.<clinit>(<console>)
      	at $line39.$eval$.<init>(<console>:7)
      	at $line39.$eval$.<clinit>(<console>)
      	at $line39.$eval.$print(<console>)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
      	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
      	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
      	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
      	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
      	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
      	at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
      	at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
      	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
      	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
      	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
      	at org.apache.spark.repl.Main$.main(Main.scala:31)
      	at org.apache.spark.repl.Main.main(Main.scala)
      NestedThrowablesStackTrace:
      org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO TABLE_PARAMS (PARAM_VALUE,TBL_ID,PARAM_KEY) VALUES (?,?,?) 
      	at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1078)
      	at org.datanucleus.store.rdbms.scostore.JoinMapStore.putAll(JoinMapStore.java:220)
      	at org.datanucleus.store.rdbms.mapping.java.MapMapping.postInsert(MapMapping.java:137)
      	at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:519)
      	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:167)
      	at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:143)
      	at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:3784)
      	at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:3760)
      	at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2219)
      	at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
      	at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
      	at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
      	at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
      	at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
      	at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:719)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
      	at com.sun.proxy.$Proxy15.createTable(Unknown Source)
      	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1261)
      	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1294)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
      	at com.sun.proxy.$Proxy16.create_table_with_environment_context(Unknown Source)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:558)
      	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
      	at com.sun.proxy.$Proxy17.createTable(Unknown Source)
      	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
      	at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:136)
      	at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:243)
      	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:55)
      	at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:55)
      	at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:65)
      	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1092)
      	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1092)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1013)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:963)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:929)
      	at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:907)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
      	at $line39.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
      	at $line39.$read$$iwC$$iwC$$iwC.<init>(<console>:42)
      	at $line39.$read$$iwC$$iwC.<init>(<console>:44)
      	at $line39.$read$$iwC.<init>(<console>:46)
      	at $line39.$read.<init>(<console>:48)
      	at $line39.$read$.<init>(<console>:52)
      	at $line39.$read$.<clinit>(<console>)
      	at $line39.$eval$.<init>(<console>:7)
      	at $line39.$eval$.<clinit>(<console>)
      	at $line39.$eval.$print(<console>)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:606)
      	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
      	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
      	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
      	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
      	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
      	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
      	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
      	at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
      	at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
      	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
      	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
      	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
      	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
      	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
      	at org.apache.spark.repl.Main$.main(Main.scala:31)
      	at org.apache.spark.repl.Main.main(Main.scala)
      Caused by: java.sql.SQLDataException: A truncation error was encountered trying to shrink VARCHAR '{"type":"struct","fields":[{"name":"contributors","type":"st&' to length 4000.
      	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
      	at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
      	at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
      	at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
      	at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
      	at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
      	at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
      	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeStatement(Unknown Source)
      	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeLargeUpdate(Unknown Source)
      	at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeUpdate(Unknown Source)
      	at com.jolbox.bonecp.PreparedStatementHandle.executeUpdate(PreparedStatementHandle.java:205)
      	at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeUpdate(ParamLoggingPreparedStatement.java:399)
      	at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:439)
      	at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1069)
      	... 87 more
      Caused by: java.sql.SQLException: A truncation error was encountered trying to shrink VARCHAR '{"type":"struct","fields":[{"name":"contributors","type":"st&' to length 4000.
      	at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
      	at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
      	... 101 more
      Caused by: ERROR 22001: A truncation error was encountered trying to shrink VARCHAR '{"type":"struct","fields":[{"name":"contributors","type":"st&' to length 4000.
      	at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
      	at org.apache.derby.iapi.types.SQLChar.hasNonBlankChars(Unknown Source)
      	at org.apache.derby.iapi.types.SQLVarchar.normalize(Unknown Source)
      	at org.apache.derby.iapi.types.SQLVarchar.normalize(Unknown Source)
      	at org.apache.derby.iapi.types.DataTypeDescriptor.normalize(Unknown Source)
      	at org.apache.derby.impl.sql.execute.NormalizeResultSet.normalizeColumn(Unknown Source)
      	at org.apache.derby.impl.sql.execute.NormalizeResultSet.normalizeRow(Unknown Source)
      	at org.apache.derby.impl.sql.execute.NormalizeResultSet.getNextRowCore(Unknown Source)
      	at org.apache.derby.impl.sql.execute.DMLWriteResultSet.getNextRowCore(Unknown Source)
      	at org.apache.derby.impl.sql.execute.InsertResultSet.open(Unknown Source)
      	at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
      	at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
      	... 95 more
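
      For reference, one way around the single-property limit is to split the schema JSON into fixed-size chunks stored under multiple table property keys and reassemble them on read. Below is a minimal sketch of that scheme, with key names modeled on the spark.sql.sources.schema.* convention; treat the exact constants as illustrative:

      // Chunk a schema JSON string so each piece fits the metastore's
      // VARCHAR(4000) limit, and rebuild the full string when reading back.
      object SchemaChunking {
        private val Threshold = 4000 // PARAM_VALUE is VARCHAR(4000) in Derby

        // Split the JSON into numbered parts plus a part count.
        def toProperties(schemaJson: String): Map[String, String] = {
          val parts = schemaJson.grouped(Threshold).toIndexedSeq
          val indexed = parts.zipWithIndex.map { case (part, i) =>
            s"spark.sql.sources.schema.part.$i" -> part
          }
          (indexed :+ ("spark.sql.sources.schema.numParts" -> parts.size.toString)).toMap
        }

        // Reassemble the JSON from the numbered parts, if present.
        def fromProperties(props: Map[String, String]): Option[String] =
          props.get("spark.sql.sources.schema.numParts").map { n =>
            (0 until n.toInt)
              .map(i => props(s"spark.sql.sources.schema.part.$i"))
              .mkString
          }
      }

      Storing the part count alongside the chunks lets a reader verify it has a complete schema before parsing, rather than silently working from a truncated one.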
      

          People

            Assignee: Yin Huai (yhuai)
            Reporter: Yin Huai (yhuai)
            Votes: 0
            Watchers: 2
