Spark / SPARK-11724

Casting integer types to timestamp has unexpected semantics

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.6.0
    • Fix Version/s: 1.6.0
    • Component/s: SQL

      Description

      Casting from integer types to timestamp treats the source value as milliseconds. Casting from timestamp to integer types produces the result in seconds. This leads to behavior like:

      scala> sql("select cast(cast (1234 as timestamp) as bigint)").show
      +---+
      |_c0|
      +---+
      |  1|
      +---+
      

      Doubles, on the other hand, are treated as seconds when casting in both directions:

      scala> sql("select cast(cast (1234.5 as timestamp) as double)").show
      +------+
      |   _c0|
      +------+
      |1234.5|
      +------+
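
      Since double casts use seconds in both directions, routing an integer through a double is one possible workaround for a consistent round trip. A minimal sketch based only on the behavior described above (not verified beyond it):

      scala> // expected to round-trip back to 1234, since double <-> timestamp uses seconds both ways
      scala> sql("select cast(cast(cast(1234 as double) as timestamp) as bigint)").show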
      

      This also breaks interoperability with other functions that return a long in seconds, in particular unix_timestamp:

      scala> sql("select cast(unix_timestamp() as timestamp)").show
      +--------------------+
      |                 _c0|
      +--------------------+
      |1970-01-17 10:03:...|
      +--------------------+
      
      scala> sql("select cast(unix_timestamp() *1000 as timestamp)").show
      +--------------------+
      |                 _c0|
      +--------------------+
      |2015-11-12 23:26:...|
      +--------------------+
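
      For comparison, if integer casts used seconds (matching the double and unix_timestamp semantics), the earlier examples would be expected to behave roughly as follows. This is a sketch of the expected results, not output from the current build:

      scala> sql("select cast(cast (1234 as timestamp) as bigint)").show   // would show 1234 instead of 1
      scala> sql("select cast(unix_timestamp() as timestamp)").show        // would show the current time without the *1000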
      

People

• Assignee: Nong Li (nongli)
• Reporter: Nong Li (nongli)
• Votes: 0
• Watchers: 4
