I'm still not sure that I agree with you on the time conversion issue...
I'm not sure what the "correct" answer is, but I'm experiencing inconsistent/unexpected behavior depending on how I convert the timestamp.
From my point of view, if a time zone isn't specified, then epoch 0 should equal 1970-01-01 00:00:00:
select cast(0.0 as timestamp) from test limit 1
select cast('1970-01-01 00:00:00' as timestamp) from test limit 1
Shouldn't those two results be the same?
This is the "inconsistency" I reference.
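To make the discrepancy concrete, here is a minimal Python sketch of what I believe is happening. This is an assumption about the behavior, not Hive source: the numeric form seems to be treated as seconds since the UTC epoch and then rendered in the session time zone, while the string form is taken verbatim as a local wall-clock time. The fixed -07:00 offset stands in for my cluster's America/Denver zone (DST ignored).

```python
from datetime import datetime, timezone, timedelta

# Hypothetical model of the two cast() paths (an assumption, not Hive source):
# - numeric: seconds since the UTC epoch, then rendered in the session zone
# - string:  taken verbatim as a local wall-clock time
denver = timezone(timedelta(hours=-7))  # America/Denver standard offset, DST ignored

numeric_cast = datetime.fromtimestamp(0, tz=denver)  # models cast(0.0 as timestamp)
string_cast = datetime(1970, 1, 1)                   # models cast('1970-01-01 00:00:00' as timestamp)

print(numeric_cast.strftime("%Y-%m-%d %H:%M:%S"))  # 1969-12-31 17:00:00
print(string_cast.strftime("%Y-%m-%d %H:%M:%S"))   # 1970-01-01 00:00:00
```

Under this model the two queries differ by exactly the session zone's UTC offset, which matches what I'm seeing.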
select to_utc_timestamp(0.0,'GMT') from test limit 1
select to_utc_timestamp('1970-01-01 00:00:00','GMT') from test limit 1
The same inconsistency is produced here, but the behavior seemed to be related to the cast() conversion taking place inside to_utc_timestamp, not to the UDF itself. If you still think this is actually an issue with the UDF, I can file a separate JIRA for that, but to me all of this traces back to cast().
In this case, my cluster is set to the America/Denver time zone, so if I convert a local epoch time to a timestamp in what seems to me the logical way, this appears to work:
select to_utc_timestamp(0.0,'America/Denver') from test limit 1
However, I say that it only "appears" to work, because if I then retrieve that same stored timestamp using the from_utc_timestamp() UDF:
select from_utc_timestamp(to_utc_timestamp(0.0,'America/Denver'),'America/Denver') from test limit 1
the result is again different from what I get with:
select from_utc_timestamp(to_utc_timestamp('1970-01-01 00:00:00','America/Denver'),'America/Denver') from test limit 1
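Here is a sketch of why the round trip can't repair the difference, under the assumption that to_utc_timestamp/from_utc_timestamp simply shift a wall-clock value by the zone's offset after cast() has already done its interpretation. The fixed -07:00 offset and the helper names are mine, not Hive's:

```python
from datetime import datetime, timedelta

OFFSET = timedelta(hours=-7)  # assumed fixed America/Denver offset, DST ignored

def to_utc(ts):    # stand-in for to_utc_timestamp: local wall time -> UTC
    return ts - OFFSET

def from_utc(ts):  # stand-in for from_utc_timestamp: UTC -> local wall time
    return ts + OFFSET

# After cast(), the numeric and string inputs already disagree:
numeric = datetime(1969, 12, 31, 17)  # models cast(0.0 as timestamp) in -07:00
string_ = datetime(1970, 1, 1)        # models cast('1970-01-01 00:00:00' as timestamp)

# The round trip is the identity, so the cast() discrepancy survives it:
print(from_utc(to_utc(numeric)))  # 1969-12-31 17:00:00
print(from_utc(to_utc(string_)))  # 1970-01-01 00:00:00
```

In other words, the two UDFs cancel each other out, so whatever cast() decided on the way in is exactly what comes back out.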
The underlying problem is that cast(STRING datetime as TIMESTAMP) produces different results from cast(INT/FLOAT datetime as TIMESTAMP).