- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 2.4.3
- Fix Version/s: None
- Component/s: Input/Output
- Labels: None
AggregatedDialect defines getJDBCType as:
dialects.flatMap(_.getJDBCType(dt)).headOption
However, when attempting to write a ByteType, PostgresDialect currently throws: https://github.com/apache/spark/blob/1217996f1574f758d8cccc1c4e3846452d24b35b/sql/core/src/main/scala/org/apache/spark/sql/jdbc/PostgresDialect.scala#L83
This prevents any other registered dialect from providing a JDBC type for ByteType: even if an earlier dialect in the chain returns a mapping, the last one (Spark's default PostgresDialect) raises an uncaught exception that aborts the whole flatMap.
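As a simplified illustration (these are stand-in functions, not Spark's actual JdbcDialect classes, and the exception type is only representative), the following sketch shows how a single throwing dialect aborts the aggregation even when another dialect has already produced a mapping:

object AggregatedDialectRepro {
  // Stand-in for JdbcDialect.getJDBCType: maps a type name to an optional JDBC type name.
  type GetJdbcType = String => Option[String]

  // A custom dialect that does handle ByteType.
  val customDialect: GetJdbcType = {
    case "ByteType" => Some("SMALLINT")
    case _          => None
  }

  // A Postgres-like dialect that throws instead of returning None.
  val postgresLikeDialect: GetJdbcType = {
    case "ByteType" => throw new IllegalArgumentException("Unsupported type")
    case _          => None
  }

  def main(args: Array[String]): Unit = {
    val dialects = Seq(customDialect, postgresLikeDialect)
    // Mirrors AggregatedDialect.getJDBCType: flatMap evaluates every dialect,
    // so the exception escapes even though customDialect already returned a mapping.
    val jdbcType = dialects.flatMap(d => d("ByteType")).headOption
    println(jdbcType) // never reached
  }
}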
https://github.com/apache/spark/pull/24845 addresses this by providing a mapping, but the general problem remains: any JdbcDialect implementation that throws in getJDBCType, for any reason, will break the aggregation in the same way.
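One possible mitigation (a sketch only, not the approach taken in the linked PR, and using the same stand-in function type as above) would be for the aggregation itself to treat a throwing dialect as "no mapping" by wrapping each call in Try:

import scala.util.Try

object SafeAggregation {
  type GetJdbcType = String => Option[String]

  // A throwing dialect degrades to None instead of failing the whole lookup.
  def safeGetJDBCType(dialects: Seq[GetJdbcType], dt: String): Option[String] =
    dialects.flatMap(d => Try(d(dt)).toOption.flatten).headOption
}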