SPARK-33588

Partition spec in SHOW TABLE EXTENDED doesn't respect `spark.sql.caseSensitive`


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.7, 3.0.1, 3.1.0
    • Fix Version/s: 2.4.8, 3.0.2, 3.1.0
    • Component/s: SQL
    • Labels: None

    Description

      For example:

      spark-sql> CREATE TABLE tbl1 (price int, qty int, year int, month int)
               > USING parquet
               > partitioned by (year, month);
      spark-sql> INSERT INTO tbl1 PARTITION(year = 2015, month = 1) SELECT 1, 1;
      spark-sql> SHOW TABLE EXTENDED LIKE 'tbl1' PARTITION(YEAR = 2015, Month = 1);
      Error in query: Partition spec is invalid. The spec (YEAR, Month) must match the partition spec (year, month) defined in table '`default`.`tbl1`';
      

      The spark.sql.caseSensitive flag is false by default, so the partition spec (YEAR = 2015, Month = 1) should be treated as valid and the command should not fail.
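
      A natural way to fix this is to resolve the user-supplied partition key names against the table's declared partition columns using the session's name resolution rules before validating the spec. Below is a minimal, self-contained Scala sketch of that normalization; the names PartitionSpecNormalization, normalizePartitionSpec and the resolver values are illustrative assumptions, not the actual Spark patch.

      // Sketch: map user-supplied partition keys (e.g. "YEAR", "Month") onto the
      // table's declared partition columns using a configurable name resolver.
      object PartitionSpecNormalization {
        // A resolver decides whether a declared column name matches a user-supplied name.
        type Resolver = (String, String) => Boolean

        val caseSensitiveResolution: Resolver = _ == _
        val caseInsensitiveResolution: Resolver = _ equalsIgnoreCase _

        def normalizePartitionSpec(
            spec: Map[String, String],
            partColNames: Seq[String],
            tableName: String,
            resolver: Resolver): Map[String, String] = {
          spec.map { case (key, value) =>
            // Find the declared partition column that matches the user key
            // under the session's resolution rules.
            val normalizedKey = partColNames.find(resolver(_, key)).getOrElse {
              throw new IllegalArgumentException(
                s"$key is not a valid partition column in table $tableName")
            }
            normalizedKey -> value
          }
        }
      }

      object Example extends App {
        import PartitionSpecNormalization._
        // With case-insensitive resolution, (YEAR, Month) resolves to (year, month).
        val normalized = normalizePartitionSpec(
          Map("YEAR" -> "2015", "Month" -> "1"),
          Seq("year", "month"),
          "default.tbl1",
          caseInsensitiveResolution)
        println(normalized) // Map(year -> 2015, month -> 1)
      }

      With this kind of normalization in place, SHOW TABLE EXTENDED would accept (YEAR = 2015, Month = 1) under the default configuration and reject it only when spark.sql.caseSensitive is true.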


          People

            Assignee: Max Gekk
            Reporter: Max Gekk
            Votes: 0
            Watchers: 3
