Currently, Spark parses timestamp strings from JSON/CSV with only millisecond precision, while timestamps are represented internally with microsecond precision. This ticket aims to modify the parsing logic in Spark 2.4 to support microsecond precision. Porting DateFormatter/TimestampFormatter from Spark 3.0-preview is risky, so we need to find a lighter solution.
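The precision gap comes from the legacy `java.text.SimpleDateFormat`-style parsers used on the 2.4 line, which have no sub-millisecond field, in contrast to `java.time.format.DateTimeFormatter`, which the Spark 3.0 formatters build on. A minimal sketch of the difference (helper names are hypothetical, not Spark APIs):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.TimeZone;

public class PrecisionDemo {
    // Legacy path: SimpleDateFormat's finest field is milliseconds, so a
    // six-digit fraction like ".123456" is read as 123456 *milliseconds*
    // and leniently rolled over into the minutes/seconds fields.
    static long parseWithLegacy(String s) {
        SimpleDateFormat f = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS");
        f.setTimeZone(TimeZone.getTimeZone("UTC"));
        try {
            return f.parse(s).getTime(); // epoch milliseconds
        } catch (ParseException e) {
            throw new IllegalArgumentException(e);
        }
    }

    // java.time path: DateTimeFormatter models fractions down to
    // nanoseconds, so the microsecond part survives parsing.
    static LocalDateTime parseWithJavaTime(String s) {
        DateTimeFormatter f = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSSSS");
        return LocalDateTime.parse(s, f);
    }

    public static void main(String[] args) {
        String ts = "2019-01-01T00:00:00.123456";
        System.out.println(parseWithLegacy(ts));   // shifted by ~123 seconds
        System.out.println(parseWithJavaTime(ts)); // 2019-01-01T00:00:00.123456
    }
}
```

The lenient roll-over is why the legacy parser does not just truncate the fraction but silently shifts the timestamp, which is the behavior the linked tickets report.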
Is related to:
- SPARK-27224 Spark to_json parses UTC timestamp incorrectly (Resolved)
- SPARK-6385 ISO 8601 timestamp parsing does not support arbitrary precision second fractions (Resolved)
- SPARK-10681 DateTimeUtils needs a method to parse string to SQL's timestamp value (Resolved)

Is required by:
- SPARK-29927 Parse timestamps in microsecond precision by `to_timestamp`, `to_unix_timestamp`, `unix_timestamp` (Resolved)

Links to: