Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Fix Version: 4.7.0
Description
The current implementation binds the 'DATE' datatype to the Spark SQL 'TimestampType', which causes a casting error when converting from java.sql.Date to java.sql.Timestamp while using the DataFrame API with Phoenix DATE columns.
This patch modifies the schema handling to treat DATE columns as the Spark 'DateType' instead. Note that Spark drops the hour, minute and second values from these columns when interfacing through DataFrames. This follows the java.sql.Date spec, but may not be useful to those who rely on the hour/minute/second fields working with the DataFrame API and DATE columns. A future improvement would be to map these to TimestampType instead to preserve that information, but that is less intuitive and probably shouldn't be the default behaviour.
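A minimal sketch of the two behaviours described above, using only the standard JDBC types (the class name `DateTypeDemo` and the sample timestamp are illustrative, not from the patch): java.sql.Date is a sibling of java.sql.Timestamp rather than a supertype, so a value materialized as a Date can never be cast to a Timestamp, and once a value surfaces as a Date its time-of-day component is no longer visible.

```java
import java.sql.Date;
import java.sql.Timestamp;

public class DateTypeDemo {
    public static void main(String[] args) {
        // A DATE value that also carries time-of-day information (sample value).
        Timestamp ts = Timestamp.valueOf("2016-01-15 10:30:45");

        // Both classes extend java.util.Date, but neither extends the other,
        // so a column value read back as java.sql.Date is not a Timestamp
        // and casting it to one would throw a ClassCastException.
        Object stored = new Date(ts.getTime());
        System.out.println(stored instanceof Timestamp); // false

        // Date.toString() renders only yyyy-MM-dd, so the hour/minute/second
        // component is effectively dropped once the value is exposed as a
        // Spark DateType.
        System.out.println(stored); // 2016-01-15
    }
}
```

This is why mapping DATE to DateType fixes the cast error at the cost of losing the time portion that Phoenix DATE columns can store.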