Description
It would be great if Spark supported local times in DataFrames, rather than only instants.
The specific use case I have in mind is something like:
- parse "2019-01-01 17:00" (no time zone) from CSV -> a LocalDateTime column in the DataFrame
- save to Parquet: the LocalDateTime is stored with the same integer value as 2019-01-01 17:00 UTC, but with isAdjustedToUTC=false. (Currently Spark writes either INT96 or TIMESTAMP_MILLIS/TIMESTAMP_MICROS, which have isAdjustedToUTC=true.) See the sketch after this list.
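Since the linked SPARK-35662 is resolved, this behavior is now expressible through the TIMESTAMP_NTZ type. The following is a minimal sketch, assuming Spark 3.4+ and a hypothetical events.csv input file, of reading a local timestamp from CSV and writing it to Parquet so the column carries isAdjustedToUTC=false:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StringType, StructField, StructType, TimestampNTZType}

object LocalTimestampSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("timestamp-ntz-sketch")
      .master("local[*]")
      .getOrCreate()

    // Declare the column as TIMESTAMP_NTZ so "2019-01-01 17:00" is kept as a
    // local (wall-clock) value instead of being shifted into the session time zone.
    val schema = StructType(Seq(
      StructField("event", StringType),
      StructField("ts", TimestampNTZType)
    ))

    val df = spark.read
      .schema(schema)
      .option("header", "true")
      .option("timestampNTZFormat", "yyyy-MM-dd HH:mm") // matches the example value above
      .csv("events.csv")                                // hypothetical input file

    // A TIMESTAMP_NTZ column should be written to Parquet annotated with
    // isAdjustedToUTC = false, which is the behavior requested here.
    df.write.parquet("events_parquet")

    spark.stop()
  }
}
```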
Issue Links
- duplicates SPARK-35662 Support Timestamp without time zone data type (Resolved)