SPARK-33432

SQL parser should use active SQLConf


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.1
    • Fix Version/s: 3.1.0
    • Component/s: SQL
    • Labels: None

    Description

      In ANSI mode, schema string parsing should fail if the schema uses an ANSI reserved keyword as an attribute name:

      spark.conf.set("spark.sql.ansi.enabled", "true")
      spark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));""").show
      
      
      output:
      
      Cannot parse the data type: 
      no viable alternative at input 'time'(line 1, pos 0)
      
      == SQL ==
      time Timestamp
      ^^^
      

      But this query may accidentally succeed in certain cases because the DataType parser sticks to the configs of the first session created in the current thread:

      import org.apache.spark.sql.types.DataType

      // Exercise the DDL parser under the original session first (ANSI mode off).
      DataType.fromDDL("time Timestamp")
      val newSpark = spark.newSession()
      newSpark.conf.set("spark.sql.ansi.enabled", "true")
      // ANSI mode is on in the new session, but the parser keeps the first session's conf, so the query still succeeds.
      newSpark.sql("""select from_json('{"time":"26/10/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));""").show
      
      
      output:
      
      +--------------------------------+
      |from_json({"time":"26/10/2015"})|
      +--------------------------------+
      |            {2015-10-26 00:00...|
      +--------------------------------+
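
      The fix direction suggested by the title is for the parser to read the active SQLConf (via SQLConf.get) at parse time rather than sticking to the conf captured when the parser was first created. The sketch below only illustrates that pattern; ExampleParser and its parse method are hypothetical and not the actual patch:

      import org.apache.spark.sql.internal.SQLConf

      class ExampleParser {
        // Capturing the conf here would pin the parser to whatever session was
        // active when the parser was constructed (the behavior reported above):
        // private val capturedConf = SQLConf.get

        def parse(schemaString: String): Unit = {
          // Look up the active session's conf on every call, so a new session's
          // spark.sql.ansi.enabled setting takes effect.
          val conf = SQLConf.get
          if (conf.ansiEnabled) {
            // reject ANSI reserved keywords used as attribute names
          }
          // ... parse schemaString using conf ...
        }
      }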
      


          People

            Assignee: Lu Lu (luluorta)
            Reporter: Lu Lu (luluorta)
            Votes: 0
            Watchers: 3
