Description
Currently, for JDBC, the SQL TIME type is incorrectly represented as Spark TimestampType. It should instead be represented as a physical int in milliseconds: a time of day with no reference to a particular calendar, time zone, or date, with a precision of one millisecond, stored as the number of milliseconds after midnight, 00:00:00.000.
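As an illustration of the intended physical representation (a minimal sketch with a hypothetical value, plain JDK only, no Spark or Avro dependency), the milliseconds-after-midnight int decodes to a time of day like this:

```java
import java.time.LocalTime;

public class TimeMillisDemo {
    public static void main(String[] args) {
        // Avro time-millis stores the time of day as a physical int:
        // the number of milliseconds after midnight, 00:00:00.000.
        int millisAfterMidnight = 55_500_123; // hypothetical value

        // Decode back to a time of day (1 ms = 1_000_000 ns).
        LocalTime timeOfDay = LocalTime.ofNanoOfDay(millisAfterMidnight * 1_000_000L);
        System.out.println(timeOfDay); // prints 15:25:00.123
    }
}
```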
We encountered an issue where the Avro logical type `time-millis` was not converted correctly to the Spark `Timestamp` type by `SchemaConverters`; it converted to a plain `int` instead. Reproducible by ingesting data from a MySQL table with a column of TIME type: the Spark JDBC DataFrame gets the Timestamp type, but enforcing our Avro schema (`{"type": "int", "logicalType": "time-millis"}`) externally fails with the following exception:
`java.lang.RuntimeException: java.sql.Timestamp is not a valid external type for schema of int`
Issue Links
- causes: SPARK-34180 Fix the regression brought by SPARK-33888 for PostgresDialect (Resolved)
- links to