SPARK-34246: New type coercion syntax rules in ANSI mode


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Fix Version/s: 3.1.2, 3.2.0
    • Component/s: SQL
    • Labels: None

    Description

      Add new implicit cast syntax rules in ANSI mode.
      In Spark ANSI mode, the type coercion rules are based on the type precedence lists of the input data types.
      As per the section "Type precedence list determination" of "ISO/IEC 9075-2:2011
      Information technology — Database languages — SQL — Part 2: Foundation (SQL/Foundation)", the type precedence lists of the primitive
      data types are as follows:

      • Byte: Byte, Short, Int, Long, Decimal, Float, Double
      • Short: Short, Int, Long, Decimal, Float, Double
      • Int: Int, Long, Decimal, Float, Double
      • Long: Long, Decimal, Float, Double
      • Decimal: Any wider Numeric type
      • Float: Float, Double
      • Double: Double
      • String: String
      • Date: Date, Timestamp
      • Timestamp: Timestamp
      • Binary: Binary
      • Boolean: Boolean
      • Interval: Interval
      As for complex data types, Spark determines the precedence list recursively based on their sub-types.
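The precedence lists above can be modeled as a simple lookup table. Here is a minimal sketch in Python; the type names and the `can_implicitly_cast` helper are illustrative only, not Spark's internal API:

```python
# Type precedence lists for primitive types, as listed above.
# An implicit cast S -> T is allowed iff T appears in PRECEDENCE[S].
PRECEDENCE = {
    "Byte":      ["Byte", "Short", "Int", "Long", "Decimal", "Float", "Double"],
    "Short":     ["Short", "Int", "Long", "Decimal", "Float", "Double"],
    "Int":       ["Int", "Long", "Decimal", "Float", "Double"],
    "Long":      ["Long", "Decimal", "Float", "Double"],
    "Decimal":   ["Decimal", "Float", "Double"],  # "any wider numeric type"
    "Float":     ["Float", "Double"],
    "Double":    ["Double"],
    "String":    ["String"],
    "Date":      ["Date", "Timestamp"],
    "Timestamp": ["Timestamp"],
    "Binary":    ["Binary"],
    "Boolean":   ["Boolean"],
    "Interval":  ["Interval"],
}

def can_implicitly_cast(source: str, target: str) -> bool:
    """Data type `source` may be implicitly cast to `target`
    iff `target` is in the precedence list of `source`."""
    return target in PRECEDENCE.get(source, [])

if __name__ == "__main__":
    print(can_implicitly_cast("Int", "Double"))   # True: Double is in Int's list
    print(can_implicitly_cast("String", "Int"))   # False: String only casts to String
```

Note that the table is deliberately asymmetric: `Date` may widen to `Timestamp`, but not the reverse, and `Double` never implicitly narrows to `Float`.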

      With the type precedence lists defined, the general type coercion rules are as follows:

      • Data type S is allowed to be implicitly cast to type T iff T is in the precedence list of S.
      • Comparison is allowed iff the precedence lists of both sides share at least one common element. When evaluating the comparison, Spark casts both sides to the tightest common data type of their precedence lists.
      • For the following operators, there must be at least one common data type among all the children's precedence lists. The result data type of the operator is the tightest common precedence data type:
        In
        Except
        Intersect
        Greatest
        Least
        Union
        If
        CaseWhen
        CreateArray
        Array Concat
        Sequence
        MapConcat
        CreateMap
      
      • For complex types (struct, array, map), Spark recursively looks into the element type and applies the rules above. If the element nullability would change from nullable (true) to non-nullable (false), a runtime null check is added on the elements.
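The "tightest common type" rule for multi-child operators can be sketched by intersecting the children's precedence lists and taking the earliest surviving entry. This is an illustrative model of the rule stated above, not Spark's implementation; the table is a subset of the precedence lists from the description:

```python
# Subset of the type precedence table from the description above.
PRECEDENCE = {
    "Byte":    ["Byte", "Short", "Int", "Long", "Decimal", "Float", "Double"],
    "Short":   ["Short", "Int", "Long", "Decimal", "Float", "Double"],
    "Int":     ["Int", "Long", "Decimal", "Float", "Double"],
    "Long":    ["Long", "Decimal", "Float", "Double"],
    "Decimal": ["Decimal", "Float", "Double"],
    "Float":   ["Float", "Double"],
    "Double":  ["Double"],
    "Date":    ["Date", "Timestamp"],
    "String":  ["String"],
}

def tightest_common_type(types):
    """Return the tightest common data type of the given types:
    the first entry of the first type's precedence list that appears
    in every other type's precedence list, or None if the lists
    share no common element (in which case the operator is rejected)."""
    first, *rest = types
    for candidate in PRECEDENCE[first]:
        if all(candidate in PRECEDENCE[t] for t in rest):
            return candidate
    return None

if __name__ == "__main__":
    print(tightest_common_type(["Int", "Long"]))     # Long
    print(tightest_common_type(["Short", "Float"]))  # Float
    print(tightest_common_type(["Int", "String"]))   # None: no common element
```

Because each precedence list is ordered from tightest to widest, scanning the first operand's list in order guarantees the narrowest type acceptable to all children is chosen, e.g. `Short` and `Float` resolve to `Float` rather than `Double`.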


            People

              Assignee: Gengliang Wang
              Reporter: Gengliang Wang
