Spark / SPARK-18464

Spark SQL fails to load tables created without providing a schema


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.1.0
    • Component/s: SQL
    • Labels: None

    Description

      I have an old table that was created without providing a schema. Branch 2.1 seems to fail to load it, complaining that the schema is corrupt.
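      As a minimal sketch of how such a table may have been created on an older release (the database, table, and path names below are hypothetical), a data source table persisted via saveAsTable without an explicit schema:

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .enableHiveSupport()
          .getOrCreate()

        // Read existing Parquet files; the schema is inferred from the file footers.
        val df = spark.read.parquet("/myPath")

        // Persist as an external data source table. Older releases did not write
        // the inferred schema into the Hive metastore for tables created this way.
        df.write
          .format("parquet")
          .option("path", "/myPath")
          .saveAsTable("mydb.mytable")

        // On branch 2.1, resolving the table then fails at load time:
        //   spark.table("mydb.mytable")   // complains that the schema is corrupt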

      With spark.sql.debug enabled, I retrieved the table metadata using DESCRIBE FORMATTED:

      [col,array<string>,from deserializer]
      [,,]
      [# Detailed Table Information,,]
      [Database:,mydb,]
      [Owner:,root,]
      [Create Time:,Fri Jun 17 11:55:07 UTC 2016,]
      [Last Access Time:,Thu Jan 01 00:00:00 UTC 1970,]
      [Location:,mylocation,]
      [Table Type:,EXTERNAL,]
      [Table Parameters:,,]
      [  transient_lastDdlTime,1466164507,]
      [  spark.sql.sources.provider,parquet,]
      [,,]
      [# Storage Information,,]
      [SerDe Library:,org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe,]
      [InputFormat:,org.apache.hadoop.mapred.SequenceFileInputFormat,]
      [OutputFormat:,org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat,]
      [Compressed:,No,]
      [Storage Desc Parameters:,,]
      [  path,/myPatch,]
      [  serialization.format,1,]
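      For reference, a sketch of how the output above can be obtained (the table name mydb.mytable is hypothetical). spark.sql.debug is an internal flag and has to be set before the session starts, e.g. spark-shell --conf spark.sql.debug=true:

        // Started with: spark-shell --conf spark.sql.debug=true, which keeps the
        // raw metastore entry available so the table can still be described.
        spark.sql("DESCRIBE FORMATTED mydb.mytable")
          .collect()
          .foreach(println)   // each Row prints as [col_name,data_type,comment]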
      


          People

            Assignee: Wenchen Fan (cloud_fan)
            Reporter: Yin Huai (yhuai)
