Description
I have an old table that was created without providing a schema. It seems branch 2.1 fails to load it, saying that the schema is corrupt.
With spark.sql.debug enabled, I get the following metadata by running DESCRIBE FORMATTED:
col                           array<string>     from deserializer

# Detailed Table Information
Database:                     mydb
Owner:                        root
Create Time:                  Fri Jun 17 11:55:07 UTC 2016
Last Access Time:             Thu Jan 01 00:00:00 UTC 1970
Location:                     mylocation
Table Type:                   EXTERNAL
Table Parameters:
  transient_lastDdlTime       1466164507
  spark.sql.sources.provider  parquet

# Storage Information
SerDe Library:                org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:                  org.apache.hadoop.mapred.SequenceFileInputFormat
OutputFormat:                 org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat
Compressed:                   No
Storage Desc Parameters:
  path                        /myPatch
  serialization.format        1
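For reference, a minimal sketch of how this output can be reproduced from spark-shell. The table name mytable is a placeholder (only the database name mydb appears above), and the debug flag is assumed to be passed at startup:

  // Assumes spark-shell was started with the internal debug flag, e.g.
  //   bin/spark-shell --conf spark.sql.debug=true
  // "mytable" is a placeholder table name for illustration.
  spark.sql("DESCRIBE FORMATTED mydb.mytable").show(numRows = 100, truncate = false)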