Details

    • Type: Sub-task
    • Status: Closed
    • Priority: Major
    • Resolution: Won't Fix
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      Executing the SQL below in Spark returns "abcdef", but other DBMSs return "abc" (which I think is more sensible).

      select cast("abcdef" as char(3));
      
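      For reference, the same result can be reproduced from the Scala API. The sketch below is illustrative only (a local session on Spark 3.0.0 is assumed); the printed value is the one reported above.

      import org.apache.spark.sql.SparkSession

      // Minimal reproduction sketch (assumes a local Spark 3.0.0 session).
      val spark = SparkSession.builder()
        .appName("char-cast-repro")
        .master("local[*]")
        .getOrCreate()

      // Prints "abcdef" instead of the truncated "abc" returned by other DBMSs.
      spark.sql("""select cast("abcdef" as char(3))""").show()
      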

      I then checked the source code; it seems char/varchar are only used during DDL parsing.

      /**
       * Hive char type. Similar to other HiveStringType's, these datatypes should only used for
       * parsing, and should NOT be used anywhere else. Any instance of these data types should be
       * replaced by a [[StringType]] before analysis.
       */
      case class CharType(length: Int) extends HiveStringType {
        override def simpleString: String = s"char($length)"
      }
      
      /**
       * Hive varchar type. Similar to other HiveStringType's, these datatypes should only used for
       * parsing, and should NOT be used anywhere else. Any instance of these data types should be
       * replaced by a [[StringType]] before analysis.
       */
      case class VarcharType(length: Int) extends HiveStringType {
        override def simpleString: String = s"varchar($length)"
      }
      
      
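      To make that replacement concrete, here is a hedged sketch (the table name and session setup are mine, not from the issue): a column declared as char(3)/varchar(5) in DDL comes back as plain StringType in the resolved schema, so the declared length is never enforced, which would explain the cast result above.

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.types.StringType

      val spark = SparkSession.builder()
        .appName("char-type-sketch")
        .master("local[*]")
        .getOrCreate()

      // Hypothetical table used only for illustration.
      spark.sql("create table t_char_demo (c char(3), v varchar(5)) using parquet")

      // Both columns resolve to StringType; the declared lengths are dropped
      // during parsing, so cast(... as char(n)) cannot truncate.
      spark.table("t_char_demo").schema.foreach(f => println(s"${f.name}: ${f.dataType}"))
      // Expected output (under this assumption):
      //   c: StringType
      //   v: StringType
      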

      Is this behavior expected? 

          People

            Assignee: Unassigned
            Reporter: Zhu, Lipeng (lipzhu)