Description
Spark SQL's JDBC data source lets users specify an explicit JDBC driver class to load via the driver argument, but in the current code the user-specified driver may not actually be the one used when it comes time to create a JDBC connection.
In a nutshell, the problem is that multiple JDBC drivers on the classpath can claim to handle the same subprotocol, and there is no intuitive way to control which of those drivers takes precedence when java.sql.DriverManager chooses one based on the connection URL.
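For illustration, here is a minimal sketch of how a connection could be opened using an explicitly named driver class instead of relying on DriverManager.getConnection (which simply returns the first registered driver that accepts the URL). This is not Spark's actual code; the object ExplicitDriverConnection and its connect method are hypothetical names used only for this example.
{code:scala}
import java.sql.{Connection, Driver, DriverManager}
import java.util.Properties
import scala.collection.JavaConverters._

object ExplicitDriverConnection {
  def connect(url: String, driverClass: String, props: Properties): Connection = {
    // Loading the class causes a well-behaved JDBC driver to register itself
    // with DriverManager via its static initializer.
    Class.forName(driverClass)
    // Scan the registered drivers and pick the one whose class name matches
    // the user-specified driver, rather than letting DriverManager choose
    // whichever registered driver first accepts the URL's subprotocol.
    val driver: Driver = DriverManager.getDrivers.asScala
      .find(_.getClass.getName == driverClass)
      .getOrElse(throw new IllegalStateException(
        s"Did not find registered driver with class $driverClass"))
    // Driver.connect returns null if the driver does not accept the URL.
    driver.connect(url, props)
  }
}
{code}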
Issue Links
- is duplicated by SPARK-12563 "No suitable driver" when calling JdbcUtils.saveTable in isolation (Resolved)