Hive / HIVE-1435

Upgraded naming scheme causes JDO exceptions

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.6.0
    • Component/s: Metastore
    • Labels:
      None
    • Hadoop Flags:
      Reviewed

      Description

      We recently upgraded from Datanucleus 1.0 to 2.0, which changed some of the defaults for how field names get mapped to datastore identifiers. Because of this change, connecting to an existing database would throw exceptions such as:

      2010-06-24 17:59:09,854 ERROR exec.DDLTask (SessionState.java:printError(277)) - FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.MStorageDescriptor@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
      NestedThrowables:
      com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
      org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.MStorageDescriptor@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
      NestedThrowables:
      com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
      at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:325)
      at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:2012)
      at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:144)
      at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
      at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
      at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
      at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
      at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
      at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
      at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
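
      The exception is raised where Hive's JDO layer talks to the pre-existing metastore database. For context, a metastore configured against an existing MySQL instance typically carries hive-site.xml entries like the sketch below; the host, database name, and credentials are placeholders, not the reporter's actual settings.

      <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <!-- placeholder URL; the schema behind it was created under DataNucleus 1.0 -->
        <value>jdbc:mysql://metastore-host/metastore_db</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
      </property>
      <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive-password</value>
      </property>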

        Issue Links

        • relates to HIVE-2059

          Activity

          Paul Yang created issue -
          Paul Yang made changes -
          Field: Description
          Paul Yang added a comment -

          This reverts the identifier factory to the same as datanucleus 1.1.
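
          For anyone hitting the same mismatch on a release without this patch, the setting below is one way to pin the identifier mapping from hive-site.xml. It is a sketch based on the comment above, not the committed patch itself: it assumes DataNucleus's datanucleus.identifierFactory property is the knob in question and that the value datanucleus1 selects the pre-upgrade naming scheme.

          <!-- Assumed workaround, not taken from the patch: pin the DataNucleus
               identifier factory so generated column names (e.g. for
               MStorageDescriptor.isCompressed) match a metastore schema created
               before the DataNucleus 2.0 upgrade. -->
          <property>
            <name>datanucleus.identifierFactory</name>
            <value>datanucleus1</value>
          </property>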
          Paul Yang made changes -
          Attachment HIVE-1435.1.patch [ 12448011 ]
          Paul Yang added a comment -

          This patch was tested with a metastore database created prior to the DataNucleus upgrade. In our configuration, we have

          <property>
            <name>datanucleus.autoCreateSchema</name>
            <value>false</value>
          </property>
          <property>
            <name>datanucleus.fixedDatastore</name>
            <value>true</value>
          </property>

          so that no automatic schema changes will occur. The following commands were run without errors: create table, add partition, create view, drop table, and drop view.
          Paul Yang made changes -
          Status Open [ 1 ] Patch Available [ 10002 ]
          John Sichi added a comment -

          +1.
          John Sichi added a comment -

          Committed. Thanks Paul!
          John Sichi made changes -
          Fix Version/s 0.6.0 [ 12314524 ]
          Fix Version/s 0.7.0 [ 12315150 ]
          John Sichi made changes -
          Resolution Fixed [ 1 ]
          Status Patch Available [ 10002 ] Resolved [ 5 ]
          Hadoop Flags [Reviewed]
          Carl Steinbach made changes -
          Fix Version/s 0.7.0 [ 12315150 ]
          Affects Version/s 0.6.0 [ 12314524 ]
          Affects Version/s 0.7.0 [ 12315150 ]
          Carl Steinbach made changes -
          Link This issue relates to HIVE-2059 [ HIVE-2059 ]
          Carl Steinbach made changes -
          Status Resolved [ 5 ] Closed [ 6 ]

            People

            • Assignee: Paul Yang
            • Reporter: Paul Yang
            • Votes: 0
            • Watchers: 0

              Dates

              • Created:
              • Updated:
              • Resolved:
