IMPALA-3622

'\0' is not accepted as the fields terminator when creating tables


Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: Impala 2.6.0
    • Fix Version/s: None
    • Component/s: Catalog
    • Environment: Server version: impalad version 2.6.0-cdh5-INTERNAL DEBUG (build 3bd8b45ae89659fe8d0d29cd40d6b1e68e84c393)

    Description

      According to the official documentation, in Impala 1.3.1 and higher the delimiter character can be specified as '\0' to use the ASCII 0 (NUL) character for text tables, and a SQL example is provided. In practice, however, the statement fails when run through impala-shell, so should this be regarded as a bug? I also tried '\u0000'.
      For more details, see the reference: http://www.cloudera.com/documentation/enterprise/latest/topics/impala_txtfile.html

      [ SQL ]:
      create table nul_separated(id int, s string, n int, t timestamp, b boolean)
      row format delimited
      fields terminated by '\0'
      stored as textfile;

      [ ERROR ]:
      ImpalaRuntimeException: Error making 'createTable' RPC to Hive Metastore:
      CAUSED BY: MetaException: javax.jdo.JDODataStoreException: Put request failed : INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?)
      at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:451)
      at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:732)
      at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
      at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:902)
      at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
      at com.sun.proxy.$Proxy2.createTable(Unknown Source)
      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1466)
      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1499)
      at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:138)
      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
      at com.sun.proxy.$Proxy4.create_table_with_environment_context(Unknown Source)
      at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9207)
      at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9191)
      at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
      at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
      NestedThrowablesStackTrace:
      org.datanucleus.store.rdbms.exceptions.MappedDatastoreException: INSERT INTO "SERDE_PARAMS" ("PARAM_VALUE","SERDE_ID","PARAM_KEY") VALUES (?,?,?)
      at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1078)
      at org.datanucleus.store.rdbms.scostore.JoinMapStore.putAll(JoinMapStore.java:220)
      at org.datanucleus.store.rdbms.mapping.java.MapMapping.postInsert(MapMapping.java:137)
      at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:519)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:167)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:143)
      at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:3784)
      at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:3760)
      at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2219)
      at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2314)
      at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
      at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:326)
      at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:193)
      at org.datanucleus.state.JDOStateManager.providedObjectField(JDOStateManager.java:1269)
      at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideField(MStorageDescriptor.java)
      at org.apache.hadoop.hive.metastore.model.MStorageDescriptor.jdoProvideFields(MStorageDescriptor.java)
      at org.datanucleus.state.JDOStateManager.provideFields(JDOStateManager.java:1346)
      at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:289)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:167)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:143)
      at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:3784)
      at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:3760)
      at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2219)
      at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2314)
      at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObjectAsValue(PersistableMapping.java:567)
      at org.datanucleus.store.rdbms.mapping.java.PersistableMapping.setObject(PersistableMapping.java:326)
      at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeObjectField(ParameterSetter.java:193)
      at org.datanucleus.state.JDOStateManager.providedObjectField(JDOStateManager.java:1269)
      at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideField(MTable.java)
      at org.apache.hadoop.hive.metastore.model.MTable.jdoProvideFields(MTable.java)
      at org.datanucleus.state.JDOStateManager.provideFields(JDOStateManager.java:1346)
      at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:289)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertTable(RDBMSPersistenceHandler.java:167)
      at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:143)
      at org.datanucleus.state.JDOStateManager.internalMakePersistent(JDOStateManager.java:3784)
      at org.datanucleus.state.JDOStateManager.makePersistent(JDOStateManager.java:3760)
      at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2219)
      at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:2065)
      at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1913)
      at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:217)
      at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:727)
      at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:752)
      at org.apache.hadoop.hive.metastore.ObjectStore.createTable(ObjectStore.java:902)
      at sun.reflect.GeneratedMethodAccessor36.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
      at com.sun.proxy.$Proxy2.createTable(Unknown Source)
      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1466)
      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1499)
      at sun.reflect.GeneratedMethodAccessor32.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:138)
      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:99)
      at com.sun.proxy.$Proxy4.create_table_with_environment_context(Unknown Source)
      at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9207)
      at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_table_with_environment_context.getResult(ThriftHiveMetastore.java:9191)
      at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:110)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:106)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
      at org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:118)
      at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
      at java.lang.Thread.run(Thread.java:745)
      Caused by: org.postgresql.util.PSQLException: ERROR: invalid byte sequence for encoding "UTF8": 0x00
      Hint: This error can also happen if the byte sequence does not match the encoding expected by the server, which is controlled by "client_encoding".
      at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102)
      at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835)
      at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257)
      at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500)
      at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:388)
      at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:334)
      at com.jolbox.bonecp.PreparedStatementHandle.executeUpdate(PreparedStatementHandle.java:205)
      at org.datanucleus.store.rdbms.ParamLoggingPreparedStatement.executeUpdate(ParamLoggingPreparedStatement.java:399)
      at org.datanucleus.store.rdbms.SQLController.executeStatementUpdate(SQLController.java:439)
      at org.datanucleus.store.rdbms.scostore.JoinMapStore.internalPut(JoinMapStore.java:1069)
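
      The failure ultimately comes from the PostgreSQL database backing the Hive Metastore: PostgreSQL cannot store a NUL byte (0x00) inside a text value, so writing the literal '\0' delimiter into SERDE_PARAMS.PARAM_VALUE is rejected. The following is a minimal sketch of the same behavior, assuming direct access to a PostgreSQL database; the scratch table is hypothetical and not part of the metastore schema, and the exact message may vary by server version.

      [ SQL (PostgreSQL, illustration only) ]:
      create table scratch (param_value text);
      -- a NUL byte cannot be stored in a text value:
      insert into scratch values (E'\x00');
      -- ERROR: invalid byte sequence for encoding "UTF8": 0x00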
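
      Until '\0' is handled, a possible workaround (untested here, and only applicable if the data files can be produced with a different delimiter) is to use another non-printing separator that the metastore database can store, for example Ctrl-A ('\001'); the table name below is illustrative:

      [ SQL (possible workaround) ]:
      create table ctrl_a_separated(id int, s string, n int, t timestamp, b boolean)
      row format delimited
      fields terminated by '\001'
      stored as textfile;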

    People

    • Assignee: Unassigned
    • Reporter: Youwei Wang (hayabusa)