Spark / SPARK-35030

ANSI SQL compliance


Details

    • Type: Epic
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0, 3.1.1, 3.2.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None
    • Epic Name: ANSI SQL compliance

    Description

      Build an ANSI-compliant dialect in Spark for better data quality and easier migration from traditional DBMSs to Spark. For example, Spark throws an exception at runtime instead of returning a null result when the input to a SQL operator/function is invalid.

      The new dialect is controlled by SQL Configuration `spark.sql.ansi.enabled`:

      -- `spark.sql.ansi.enabled=true`
      SELECT 2147483647 + 1;
      java.lang.ArithmeticException: integer overflow
      
      -- `spark.sql.ansi.enabled=false`
      SELECT 2147483647 + 1;
      +----------------+
      |(2147483647 + 1)|
      +----------------+
      |     -2147483648|
      +----------------+
      

      Full details of this dialect are documented in https://spark.apache.org/docs/latest/sql-ref-ansi-compliance.html.
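      The wraparound result in the second example is ordinary 32-bit two's-complement overflow. As a minimal sketch (illustrative only, not Spark code; `add_int` is a hypothetical helper standing in for Spark's integer addition), both modes can be modeled in Python:

      ```python
      INT_MIN, INT_MAX = -2**31, 2**31 - 1  # 32-bit signed integer range

      def add_int(a, b, ansi_enabled):
          """Add two 32-bit ints, mimicking Spark's two overflow behaviors."""
          result = a + b
          if INT_MIN <= result <= INT_MAX:
              return result
          if ansi_enabled:
              # ANSI mode: fail loudly, analogous to java.lang.ArithmeticException
              raise OverflowError("integer overflow")
          # Legacy mode: silently wrap around, two's-complement style
          return (result - INT_MIN) % 2**32 + INT_MIN

      print(add_int(2147483647, 1, ansi_enabled=False))  # -2147483648
      ```

      With `ansi_enabled=True` the same call raises `OverflowError`, matching the exception-throwing behavior shown above.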

      Note that some features of the ANSI dialect are not taken directly from the ANSI SQL standard, but their behavior aligns with ANSI SQL's style.


            People

              Assignee: Apache Spark
              Reporter: Gengliang Wang
              Votes: 0
              Watchers: 2
