SPARK-39354: The analysis exception is incorrect


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.3.0
    • Fix Version/s: 3.3.0, 3.4.0
    • Component/s: SQL
    • Labels: None

    Description

      scala> spark.sql("create table t1(user_id int, auct_end_dt date) using parquet;")
      res0: org.apache.spark.sql.DataFrame = []
      
      scala> spark.sql("select * from t1 join t2 on t1.user_id = t2.user_id where t1.auct_end_dt >= Date_sub('2020-12-27', 90)").show
      org.apache.spark.sql.AnalysisException: cannot resolve 'date_sub('2020-12-27', 90)' due to data type mismatch: argument 1 requires date type, however, ''2020-12-27'' is of string type.; line 1 pos 76
        at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
        at org.apache.spark.sql.catalyst.analysis.RemoveTempResolvedColumn$.$anonfun$apply$82(Analyzer.scala:4334)
        at org.apache.spark.sql.catalyst.analysis.RemoveTempResolvedColumn$.$anonfun$apply$82$adapted(Analyzer.scala:4327)
        at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:365)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1(TreeNode.scala:364)
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$foreachUp$1$adapted(TreeNode.scala:364)
      

      Since table t2 was never created, the analysis exception should instead be:

      org.apache.spark.sql.AnalysisException: Table or view not found: t2
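      For context, the Date_sub expression in the predicate is not the real problem: with default (non-ANSI) settings, the string literal is implicitly cast to date. The following is only a sketch to run in the same spark-shell session as above; the t2 schema (a single user_id int column) is hypothetical and chosen only so the join can resolve.

      // Expected single value: 2020-09-28; the string literal is implicitly cast to date.
      spark.sql("select date_sub('2020-12-27', 90)").show()

      // Hypothetical schema for t2, chosen only so the join can resolve.
      spark.sql("create table t2(user_id int) using parquet")

      // Expected: the query now analyzes and returns an empty result, confirming the
      // original failure was the unresolved table t2, not the Date_sub argument type.
      spark.sql("select * from t1 join t2 on t1.user_id = t2.user_id where t1.auct_end_dt >= Date_sub('2020-12-27', 90)").show()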
      

    People

      Assignee: Yang Jie (LuciferYang)
      Reporter: Yuming Wang (yumwang)
      Votes: 0
      Watchers: 4
