SPARK-17971

Unix timestamp handling in Spark SQL not allowing calculations on UTC times


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 1.6.2
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels: None
    • Environment: Mac OS X, JDK 7

    Description

      In our Spark data pipeline we store timed events in a bigint column called 'timestamp', whose values are Unix timestamps (points in time expressed as seconds since the epoch).
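
      To make the examples below concrete, here is a minimal sketch of such a table as it could be set up in a Spark 1.6 shell (the 'events' table name and the sample rows are hypothetical, not from this report):

        // spark-shell, Spark 1.6.x
        case class Event(id: Long, timestamp: Long)   // 'timestamp' holds Unix epoch seconds

        val events = sqlContext.createDataFrame(Seq(
          Event(1L, 1476230400L),   // 2016-10-12 00:00:00 UTC
          Event(2L, 1476273600L)    // 2016-10-12 12:00:00 UTC
        ))
        events.registerTempTable("events")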

      Our datacenter servers' Java VMs are all set up to start with the timezone set to UTC, while developers' computers are all in the US Eastern timezone.

      Given how Spark SQL datetime functions work, it's impossible to do calculations (e.g. extract and compare hours, or year-month-day triplets) using UTC values:

      • from_unixtime takes a bigint Unix timestamp and forces it into the computer's local timezone;
      • casting the bigint column to timestamp does the same (it converts it to the local timezone);
      • from_utc_timestamp works in the same way, the only difference being that it takes a string as input instead of a bigint (all three behaviors are shown in the sketch after this list).
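
      A minimal spark-shell sketch of the three behaviors, assuming (as on the developer machines above) a driver JVM in the US Eastern timezone; epoch second 0 is 1970-01-01 00:00:00 UTC:

        // All three expressions go through some local zone, never plain UTC:
        sqlContext.sql(
          """SELECT from_unixtime(0)                                 AS a,
            |       cast(0 AS timestamp)                             AS b,
            |       from_utc_timestamp('1970-01-01 00:00:00', 'PST') AS c
          """.stripMargin).show(false)
        // a: 1969-12-31 19:00:00    -- epoch rendered in local EST (UTC-5), not UTC
        // b: 1969-12-31 19:00:00.0  -- the cast applies the same local shift
        // c: 1969-12-31 16:00:00.0  -- input read as UTC, then shifted to PST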

      The result of all of this is that it's impossible to extract the individual fields of a UTC timestamp, since all timestamps always get converted to the local timezone.
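
      For instance, with the hypothetical events table sketched above and an Eastern-timezone JVM, extracting an hour field goes through the local timezone:

        // hour() sees the locally-shifted value: event 2 (12:00:00 UTC) reports
        // hour 8 (EDT, UTC-4), and event 1 (00:00:00 UTC) reports hour 20 on the
        // previous local day, 2016-10-11.
        sqlContext.sql(
          "SELECT id, hour(cast(`timestamp` AS timestamp)) AS h FROM events"
        ).show()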

          People

            Assignee: Unassigned
            Reporter: Gabriele Del Prete (gdelprete-tl)
            Votes: 0
            Watchers: 2
