Spark / SPARK-35531

Cannot insert into a Hive bucketed table if the table was created with an upper-case schema


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Versions: 3.0.0, 3.1.1, 3.2.0
    • Fix Versions: 3.3.0, 3.1.4
    • Component: SQL
    • Labels: None

    Description

      create table TEST1(
        V1 BIGINT,
        S1 INT)
      partitioned by (PK BIGINT)
      clustered by (V1)
      sorted by (S1)
      into 200 buckets
      STORED AS PARQUET;

      insert into test1
      select * from values(1,1,1);

      The insert fails with:

      org.apache.hadoop.hive.ql.metadata.HiveException: Bucket columns V1 is not part of the table columns ([FieldSchema(name:v1, type:bigint, comment:null), FieldSchema(name:s1, type:int, comment:null)]
      org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Bucket columns V1 is not part of the table columns ([FieldSchema(name:v1, type:bigint, comment:null), FieldSchema(name:s1, type:int, comment:null)]
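The error message suggests the cause: the Hive metastore stores table column names in lower case (`v1`, `s1`), while the bucket column spec keeps the upper-case spelling from the DDL (`V1`), and the membership check is case-sensitive. A minimal sketch of the kind of case-insensitive normalization that avoids the mismatch — the object and method names here are hypothetical illustrations, not Spark's actual fix:

```scala
// Hypothetical helper: resolve bucket column names against the table schema
// case-insensitively, and emit them in the schema's (lower-case) spelling,
// mirroring how the Hive metastore stores column names.
object BucketColumnNormalizer {
  def normalize(bucketCols: Seq[String], schemaCols: Seq[String]): Seq[String] =
    bucketCols.map { col =>
      schemaCols
        .find(_.equalsIgnoreCase(col)) // case-insensitive lookup
        .getOrElse(throw new IllegalArgumentException(
          s"Bucket column $col is not part of the table columns $schemaCols"))
    }
}
```

For the table above, `BucketColumnNormalizer.normalize(Seq("V1"), Seq("v1", "s1"))` yields `Seq("v1")`, which passes Hive's case-sensitive check.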

          People

            Assignee: angerszhu (angerszhuuu)
            Reporter: Hongyi Zhang (opensky)
            Votes: 0
            Watchers: 8
