Description
A table created through the HCatalog API without an explicit file format defaults to:
fileFormat=TextFile, inputformat=org.apache.hadoop.mapred.TextInputFormat, outputformat=org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat
However, when Hive fetches the table from the metastore, it replaces the output format with org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat, so the comparison between the source and target tables fails.
The code in org.apache.hadoop.hive.ql.parse.ImportSemanticAnalyzer#checkTable does a string comparison of the class names and fails:
    // check IF/OF/Serde
    String existingifc = table.getInputFormatClass().getName();
    String importedifc = tableDesc.getInputFormat();
    String existingofc = table.getOutputFormatClass().getName();
    String importedofc = tableDesc.getOutputFormat();
    if ((!existingifc.equals(importedifc))
        || (!existingofc.equals(importedofc))) {
      throw new SemanticException(
          ErrorMsg.INCOMPATIBLE_SCHEMA
              .getMsg(" Table inputformat/outputformats do not match"));
    }
This affects only tables with text and sequence file formats, not RC or ORC.
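One way to make the check tolerant of the metastore substitution is to normalize legacy output-format class names before comparing them. The sketch below is a standalone illustration, not the actual Hive fix: the class and method names (`OutputFormatCompat`, `sameOutputFormat`) are hypothetical, and it only handles the text-format pair described in this issue.

```java
// Hypothetical sketch: treat the HCatalog default output format and the
// class Hive substitutes for it as equivalent when comparing tables.
public class OutputFormatCompat {

    // Map the legacy IgnoreKeyTextOutputFormat name (seen in the imported
    // table descriptor) to the class Hive stores after substitution.
    static String normalize(String className) {
        if ("org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat"
                .equals(className)) {
            return "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat";
        }
        return className;
    }

    // Compare two output-format class names after normalization, instead of
    // the raw string equality used in ImportSemanticAnalyzer#checkTable.
    static boolean sameOutputFormat(String existingofc, String importedofc) {
        return normalize(existingofc).equals(normalize(importedofc));
    }

    public static void main(String[] args) {
        // The pair from this issue now compares as equal.
        System.out.println(sameOutputFormat(
            "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat",
            "org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat"));
    }
}
```

A real fix would cover the sequence-file case as well and would likely reuse Hive's own substitution logic rather than a hand-written table of class names.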
Attachments
Issue Links
- is cloned by HIVE-7472: CLONE - Import fails for tables created with default text, sequence and orc file formats using HCatalog API (Closed)