SPARK-17753: Simple CASE in Spark SQL throws ParseException


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0
    • Fix Version/s: 2.0.2, 2.1.0
    • Component/s: SQL
    • Labels: None

    Description

      A simple CASE expression in Spark SQL throws a parser exception in Spark 2.0.
      The following query, as well as similar queries, fails in Spark 2.0:

      scala> spark.sql("SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2 FROM hadoop_tbl_all alias WHERE  (1 = (CASE ('aaaaabbbbb' = alias.p_text) OR (8 LTE LENGTH(alias.p_text)) WHEN TRUE THEN 1  WHEN FALSE THEN 0 ELSE CAST(NULL AS INT) END))")
      org.apache.spark.sql.catalyst.parser.ParseException:
      mismatched input 'FROM' expecting {<EOF>, 'WHERE', 'GROUP', 'ORDER', 'HAVING', 'LIMIT', 'LATERAL', 'WINDOW', 'UNION', 'EXCEPT', 'INTERSECT', 'SORT', 'CLUSTER', 'DISTRIBUTE'}(line 1, pos 60)
      
      == SQL ==
      SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2 FROM hadoop_tbl_all alias WHERE  (1 = (CASE ('aaaaabbbbb' = alias.p_text) OR (8 LTE LENGTH(alias.p_text)) WHEN TRUE THEN 1  WHEN FALSE THEN 0 ELSE CAST(NULL AS INT) END))
      ------------------------------------------------------------^^^
      
        at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:197)
        at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:99)
        at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
        at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
        at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
        ... 48 elided
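
      A possible workaround, sketched under the assumption that the parser only rejects the boolean OR expression when it appears as the operand of a value-based CASE (CASE expr WHEN ... END): rewriting the predicate as a searched CASE (CASE WHEN cond THEN ... END) avoids that position entirely. The standard <= operator is substituted here for the LTE shown in the reported query; the table and column names are taken from the query above.

      scala> spark.sql("SELECT alias.p_double as a0, alias.p_text as a1, NULL as a2 FROM hadoop_tbl_all alias WHERE (1 = (CASE WHEN ('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text)) THEN 1 WHEN NOT (('aaaaabbbbb' = alias.p_text) OR (8 <= LENGTH(alias.p_text))) THEN 0 ELSE CAST(NULL AS INT) END))")

      When the condition evaluates to NULL, neither WHEN branch matches and the ELSE branch returns NULL, preserving the semantics of the original WHEN TRUE / WHEN FALSE form. Per the Fix Version/s above, the original value-based form is accepted from 2.0.2 and 2.1.0 onward.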
      


          People

            Assignee: hvanhovell Herman van Hövell
            Reporter: kdhuria kanika dhuria
            Votes: 1
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved: