Spark / SPARK-8420

Inconsistent behavior with DataFrame Timestamp between 1.3.1 and 1.4.0


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 1.4.0
    • Fix Version/s: 1.4.1, 1.5.0
    • Component/s: SQL
    • Labels: None

    Description

      I am trying out 1.4.0 and have noticed some differences in Timestamp behavior between 1.3.1 and 1.4.0.

      In 1.3.1, I can compare a Timestamp with a string:

      scala> val df = sqlContext.createDataFrame(Seq((1, Timestamp.valueOf("2015-01-01 00:00:00")), (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
      ...
      scala> df.filter($"_2" <= "2014-06-01").show
      ...
      _1 _2                  
      2  2014-01-01 00:00:...
      

      However, in 1.4.0, the filter is always false:

      scala> val df = sqlContext.createDataFrame(Seq((1, Timestamp.valueOf("2015-01-01 00:00:00")), (2, Timestamp.valueOf("2014-01-01 00:00:00"))))
      df: org.apache.spark.sql.DataFrame = [_1: int, _2: timestamp]
      
      scala> df.filter($"_2" <= "2014-06-01").show
      +--+--+
      |_1|_2|
      +--+--+
      +--+--+
      

      I am not sure whether this is intended, but I cannot find any documentation mentioning this inconsistency.
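
      For context (not part of the original report): the 1.3.1 result is what you would expect if the timestamp column were rendered as a string and compared lexicographically, because "yyyy-MM-dd HH:mm:ss" strings sort in chronological order. A minimal plain-Scala sketch of that property, reusing the values from the example above (no Spark required):

      ```scala
      import java.sql.Timestamp

      object TimestampStringOrder {
        def main(args: Array[String]): Unit = {
          val older = Timestamp.valueOf("2014-01-01 00:00:00")
          val newer = Timestamp.valueOf("2015-01-01 00:00:00")
          // "yyyy-MM-dd HH:mm:ss" is fixed-width with the most significant
          // field first, so lexicographic order matches chronological order.
          println(older.toString.compareTo("2014-06-01") <= 0) // true: 2014-01-01 sorts before 2014-06-01
          println(newer.toString.compareTo("2014-06-01") <= 0) // false: 2015-... sorts after 2014-06-01
        }
      }
      ```

      A version-independent way to write the filter is to make the intended type explicit instead of relying on implicit coercion, e.g. `df.filter($"_2" <= Timestamp.valueOf("2014-06-01 00:00:00"))`, or `df.filter($"_2" <= lit("2014-06-01").cast("timestamp"))` using `org.apache.spark.sql.functions.lit`.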



          People

            Assignee: Michael Armbrust (marmbrus)
            Reporter: Justin Yip (yipjustin)
            Yin Huai
            Votes: 0
            Watchers: 5
