SPARK-11817: Insert of timestamp with fractional seconds inserts a NULL

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.1
    • Fix Version/s: 1.5.3, 1.6.0
    • Component/s: SQL
    • Labels: None

    Description

      Using the Thrift JDBC interface.

      Inserting the value "1970-01-01 00:00:00.123456789" into a timestamp column inserts a NULL into the database. I am aware of the change noted in the 1.5 release notes: the Timestamp type's precision is reduced to 1 microsecond (1us). However, to be compatible with previous versions, I would suggest either rounding or truncating the fractional seconds rather than inserting a NULL.
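      As a rough illustration of the suggested behaviour, the sketch below trims the fractional-seconds part of a timestamp string to at most six digits (microsecond precision) on the client side before the insert. The helper name truncateToMicros and the string layout are assumptions for illustration, not part of Spark or of the eventual fix.

        // Client-side workaround sketch: keep at most six fractional digits so the
        // value stays within Spark's microsecond timestamp precision.
        def truncateToMicros(ts: String): String = {
          val dot = ts.indexOf('.')
          if (dot < 0) ts                                  // no fractional seconds to trim
          else {
            val fraction = ts.substring(dot + 1).take(6)   // keep at most 6 digits (microseconds)
            s"${ts.substring(0, dot)}.$fraction"
          }
        }

        // "1970-01-01 00:00:00.123456789" -> "1970-01-01 00:00:00.123456"
        truncateToMicros("1970-01-01 00:00:00.123456789")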

      Attachments

        Activity

          yhuai Yin Huai added a comment -

          Here is an example. In 1.5 and 1.6, we have

          scala> sqlContext.sql("select cast('1970-01-01 00:00:00.123456789' as timestamp), cast('1970-01-01 00:00:00.123456' as timestamp)").show(10, false)
          +----+--------------------------+
          |_c0 |_c1                       |
          +----+--------------------------+
          |null|1970-01-01 00:00:00.123456|
          +----+--------------------------+
          

          In 1.4, we have

          sqlContext.sql("select cast('1970-01-01 00:00:00.123456789' as timestamp), cast('1970-01-01 00:00:00.123456' as timestamp)").collect
          res7: Array[org.apache.spark.sql.Row] = Array([1970-01-01 00:00:00.123456789,1970-01-01 00:00:00.123456])
          
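          For users on an affected 1.5.x release, one possible SQL-level workaround (an assumption, not part of the fix in the pull request below) is to cut the literal down to microsecond precision before the cast, for example with substring; 26 characters covers the date, the time, and six fractional digits in the format shown above:

          // Truncate the literal to 26 characters so the cast succeeds instead of returning NULL.
          sqlContext.sql(
            "select cast(substring('1970-01-01 00:00:00.123456789', 1, 26) as timestamp) as ts"
          ).show(10, false)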
          apachespark Apache Spark added a comment -

          User 'viirya' has created a pull request for this issue:
          https://github.com/apache/spark/pull/9834

          yhuai Yin Huai added a comment -

          Issue resolved by pull request 9834
          https://github.com/apache/spark/pull/9834


          People

            Assignee: viirya (L. C. Hsieh)
            Reporter: csands@progress.com (Chip Sands)
            Votes: 0
            Watchers: 4
