SPARK-33306

Timezone ID is needed when casting from Date to String


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Versions: 3.0.0, 3.0.1, 3.1.0
    • Fix Versions: 3.0.2, 3.1.0
    • Component: SQL
    • Labels: None

    Description

      A simple way to reproduce this:

      spark-shell --conf spark.sql.legacy.typeCoercion.datetimeToString.enabled=true
      
      scala> sql("""
      select a.d1 from
       (select to_date(concat('2000-01-0', id)) as d1 from range(1, 2)) a
       join
       (select concat('2000-01-0', id) as d2 from range(1, 2)) b
       on a.d1 = b.d2
      """).show
      
      It will throw:

      
      java.util.NoSuchElementException: None.get
       at scala.None$.get(Option.scala:529)
       at scala.None$.get(Option.scala:527)
       at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
       at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
       at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
       at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
       at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
       at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter(Cast.scala:287)
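      The stack trace points at `TimeZoneAwareExpression.zoneId`, which unwraps an `Option[String]` with `.get`; when the analyzer inserts a Date-to-String cast without resolving its timezone ID, the first access throws `None.get`. A minimal self-contained sketch of that failure mode (`TimeZoneAware` and `DateToStringCast` are hypothetical stand-ins, not Spark's actual classes):

      ```scala
      import java.time.ZoneId

      // Simplified stand-in for Spark's TimeZoneAwareExpression: zoneId is
      // derived lazily from an Option, so an expression whose timeZoneId was
      // never filled in fails only when zoneId is first accessed.
      trait TimeZoneAware {
        def timeZoneId: Option[String]
        lazy val zoneId: ZoneId = ZoneId.of(timeZoneId.get)
      }

      // Hypothetical cast node; in Spark the fix is to resolve the timezone
      // on such casts before execution.
      case class DateToStringCast(timeZoneId: Option[String]) extends TimeZoneAware

      object Demo {
        def main(args: Array[String]): Unit = {
          // Resolved expression: works fine.
          println(DateToStringCast(Some("UTC")).zoneId) // prints UTC

          // Unresolved expression: same failure mode as the stack trace above.
          try DateToStringCast(None).zoneId
          catch {
            case e: NoSuchElementException =>
              println(e) // java.util.NoSuchElementException: None.get
          }
        }
      }
      ```
      
      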
      
      


            People

              Assignee: EdisonWang
              Reporter: EdisonWang
              Votes: 0
              Watchers: 3
