Details
- Type: Improvement
- Status: Closed
- Priority: Minor
- Resolution: Fixed
- Fix Version/s: 4.7.0
- Component/s: None
- Labels: None
- Flags: Patch
Description
The Phoenix DATE type is internally represented as 8 bytes, which can store a full 'yyyy-MM-dd hh:mm:ss' time component. However, Spark SQL follows the SQL Date spec and keeps only the 'yyyy-MM-dd' portion as a 4-byte type. When loading Phoenix DATE columns through the Spark DataFrame API, the 'hh:mm:ss' component is lost.
This patch allows setting a new 'dateAsTimestamp' option when loading a DataFrame, which will coerce the underlying Date object to a Timestamp so that the full time component is loaded.
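As a minimal sketch of how the option might be used from the phoenix-spark DataFrame API: the 'dateAsTimestamp' option name comes from this patch, while the table name EVENTS, its DATE column, the zkUrl value, and the Spark 1.x SQLContext setup are illustrative assumptions, not part of the patch.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object DateAsTimestampExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("date-as-timestamp-example"))
    val sqlContext = new SQLContext(sc)

    // Load a Phoenix table through the DataFrame API; with dateAsTimestamp set,
    // DATE columns come back as Spark TimestampType instead of DateType.
    val df = sqlContext.read
      .format("org.apache.phoenix.spark")
      .options(Map(
        "table" -> "EVENTS",          // hypothetical table containing a DATE column
        "zkUrl" -> "localhost:2181",  // hypothetical ZooKeeper quorum
        "dateAsTimestamp" -> "true"   // option added by this patch
      ))
      .load()

    // The DATE column should now appear as timestamp in the schema,
    // preserving the hh:mm:ss portion of each value.
    df.printSchema()

    sc.stop()
  }
}

Without the option, the same load would surface the column as a Spark DateType and the time-of-day information would be truncated.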