Spark / SPARK-21997

Spark shows different results on char/varchar columns on Parquet


Details

    • Type: Bug
    • Status: Reopened
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 2.0.2, 2.1.1, 2.2.0, 2.3.4, 2.4.4, 3.0.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      SPARK-19459 resolves CHAR/VARCHAR issues in general, but Spark still returns different results for the same table depending on the SQL configuration `spark.sql.hive.convertMetastoreParquet`. Since that configuration defaults to true, the wrong (unpadded) result is what users see by default. We should fix this.

      scala> sql("CREATE TABLE t_char(a CHAR(10), b VARCHAR(10)) STORED AS parquet")
      scala> sql("INSERT INTO TABLE t_char SELECT 'a', 'b'")
      scala> sql("SELECT * FROM t_char").show
      +---+---+
      |  a|  b|
      +---+---+
      |  a|  b|
      +---+---+
      
      scala> sql("set spark.sql.hive.convertMetastoreParquet=false")
      
      scala> sql("SELECT * FROM t_char").show
      +----------+---+
      |         a|  b|
      +----------+---+
      |a         |  b|
      +----------+---+
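
The two outputs differ because Hive's CHAR(n) type pads values with trailing spaces to exactly n characters on read, while Spark's native Parquet reader (used when `spark.sql.hive.convertMetastoreParquet` is true, the default) returns the value as stored. A minimal plain-Python sketch of the padding rule at issue (the helper names are illustrative, not Spark or Hive APIs):

```python
# Sketch of Hive read semantics, not Spark code:
#   CHAR(n)    -> value right-padded with spaces to exactly n characters
#   VARCHAR(n) -> value returned as stored, no padding

def char_read(value: str, n: int) -> str:
    """Pad with trailing spaces to length n, as the Hive SerDe path does."""
    return value.ljust(n)

def varchar_read(value: str) -> str:
    """No padding: the stored value is returned unchanged."""
    return value

# With convertMetastoreParquet=false (Hive SerDe path), column a CHAR(10)
# reads back as "a         " (length 10):
assert char_read("a", 10) == "a" + " " * 9

# With the default convertMetastoreParquet=true, Spark's native Parquet
# reader skips the padding and returns just "a" -- the mismatch reported here.
```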
      


People

    Assignee: Unassigned
    Reporter: Dongjoon Hyun (dongjoon)
    Votes: 0
    Watchers: 3
