
    Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: SQL
    • Labels: None

      Description

      Following the discussion on the dev list (http://apache-spark-developers-list.1001551.n3.nabble.com/Discuss-Follow-ANSI-SQL-on-table-insertion-td27531.html#a27562), I propose making the store assignment rules in the analyzer configurable, with consistent behavior between data source V1 and V2.
      When inserting a value into a column of a different data type, Spark performs type coercion. After this PR, we support two policies for the type coercion rules: legacy and strict.
      1. With the legacy policy, Spark allows casting any value to any data type, and a null result is returned when the conversion is invalid. The legacy policy is the only behavior in Spark 2.x and it is compatible with Hive.
      2. With the strict policy, Spark doesn't allow any type coercion that could lose precision or truncate data, e.g. converting `double` to `int` or `decimal` to `double` is not allowed (see the sketch after this list).
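
      A minimal sketch of the difference between the two policies, assuming they end up exposed through the SQL conf spark.sql.storeAssignmentPolicy with values LEGACY and STRICT (the conf key and value names are assumptions here, not part of this issue's text):

      {code:scala}
      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder()
        .appName("store-assignment-policy-sketch")
        .master("local[*]")
        .getOrCreate()

      spark.sql("CREATE TABLE t (i INT) USING parquet")

      // Legacy policy: any value can be cast to the column type; an invalid
      // conversion stores NULL instead of failing (Spark 2.x / Hive behavior).
      spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
      spark.sql("INSERT INTO t VALUES ('hello')") // succeeds, the row contains NULL

      // Strict policy: a coercion that may lose precision or truncate data is
      // rejected during analysis, so the query fails before anything is written.
      spark.conf.set("spark.sql.storeAssignmentPolicy", "STRICT")
      spark.sql("INSERT INTO t VALUES ('hello')") // throws AnalysisException
      {code}

      The strict failure surfaces at analysis time, which is the point of making the store assignment rules part of the analyzer: an invalid insert is reported up front instead of silently producing NULLs.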

      Eventually, the "legacy" mode will be removed, so it is disallowed in data source V2.
      To ensure backward compatibility with existing queries, the default store assignment policy for data source V1 remains "legacy" until ANSI mode is implemented.
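
      Since data source V2 can only use the stricter behavior, it is worth noting that lossless widening should still pass under it; a sketch continuing the session above, assuming strict permits exactly the safe up-casts (an assumption about the implementation, not text from this issue):

      {code:scala}
      // Lossless widening cannot truncate or lose precision, so storing an
      // INT value into a BIGINT column is still acceptable under strict.
      spark.conf.set("spark.sql.storeAssignmentPolicy", "STRICT")
      spark.sql("CREATE TABLE wide (l BIGINT) USING parquet")
      spark.sql("INSERT INTO wide VALUES (1)")                 // INT -> BIGINT: allowed

      // A potentially lossy coercion is still rejected by the analyzer.
      spark.sql("INSERT INTO wide SELECT CAST(1.5 AS DOUBLE)") // DOUBLE -> BIGINT: error
      {code}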


            People

            • Assignee: Gengliang Wang
            • Reporter: Gengliang Wang
            • Votes: 0
            • Watchers: 3
