SPARK-12579: User-specified JDBC driver should always take precedence

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: SQL
    • Labels: None

      Description

      Spark SQL's JDBC data source allows users to specify an explicit JDBC driver to load using the driver argument, but in the current code it's possible that the user-specified driver will not be used when it comes time to actually create a JDBC connection.
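
      The driver option referenced above is the documented way to name the driver class on the JDBC data source; a typical read that sets it explicitly looks like the sketch below (the URL, table, and driver class are placeholders, not details from this issue):

      {code:scala}
      // Illustrative use of the JDBC data source's `driver` option; the
      // connection details are placeholders.
      val df = sqlContext.read
        .format("jdbc")
        .option("url", "jdbc:postgresql://dbhost:5432/mydb")
        .option("dbtable", "my_table")
        .option("driver", "org.postgresql.Driver")  // the driver the user expects to be used
        .load()
      {code}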

      In a nutshell, the problem is that you might have multiple JDBC drivers on your classpath that claim to be able to handle the same subprotocol, and there doesn't seem to be an intuitive way to control which of those drivers takes precedence.
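
      The direction implied by the title is to resolve the user-specified driver class directly instead of letting java.sql.DriverManager hand the URL to whichever registered driver accepts it first. A minimal sketch of that idea, in which the helper name and error handling are assumptions rather than Spark's actual implementation:

      {code:scala}
      import java.sql.{Connection, Driver}
      import java.util.Properties

      // Hypothetical helper: instantiate the user-specified driver class and
      // connect through it directly, so other registered drivers for the same
      // subprotocol cannot take precedence regardless of classpath ordering.
      def connectWithUserDriver(driverClass: String,
                                url: String,
                                props: Properties): Connection = {
        val driver = Class.forName(driverClass).newInstance().asInstanceOf[Driver]
        val conn = driver.connect(url, props)
        // Per the JDBC contract, connect() returns null if the driver does not
        // accept the given URL.
        if (conn == null) {
          throw new IllegalArgumentException(
            s"Driver $driverClass does not accept JDBC URL $url")
        }
        conn
      }
      {code}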

    People

    • Assignee: Josh Rosen (joshrosen)
    • Reporter: Josh Rosen (joshrosen)
    • Votes: 0
    • Watchers: 4
