Spark / SPARK-12579

User-specified JDBC driver should always take precedence


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: SQL
    • Labels: None

    Description

      Spark SQL's JDBC data source allows users to specify an explicit JDBC driver to load using the driver argument, but in the current code it's possible that the user-specified driver will not be used when it comes time to actually create a JDBC connection.

      In a nutshell, the problem is that you might have multiple JDBC drivers on your classpath that claim to handle the same subprotocol; when that happens, there is no intuitive way to control which of those drivers takes precedence.
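      The root cause can be illustrated with plain JDBC, outside Spark. `DriverManager.getConnection` hands the URL to each registered driver in registration order and uses the first one that accepts it, so a data source that wants to honor an explicit `driver` argument has to select the driver itself, for example by filtering `DriverManager.getDrivers()` by class name. The sketch below is hypothetical (the `DriverA`/`DriverB`/`pickDriver` names are illustrative, not Spark code):

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Enumeration;
import java.util.Properties;
import java.util.logging.Logger;

// Minimal no-op driver; both subclasses claim the same "jdbc:demo:" subprotocol.
abstract class DemoDriver implements Driver {
    public Connection connect(String url, Properties info) { return null; }
    public boolean acceptsURL(String url) { return url.startsWith("jdbc:demo:"); }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }
    public int getMajorVersion() { return 1; }
    public int getMinorVersion() { return 0; }
    public boolean jdbcCompliant() { return false; }
    public Logger getParentLogger() throws SQLFeatureNotSupportedException {
        throw new SQLFeatureNotSupportedException();
    }
}

class DriverA extends DemoDriver {}
class DriverB extends DemoDriver {}

public class JdbcDriverPrecedence {
    // Select the driver whose class name matches the user-specified one,
    // instead of letting DriverManager pick by registration order.
    static Driver pickDriver(String userSpecifiedClass) {
        for (Enumeration<Driver> e = DriverManager.getDrivers(); e.hasMoreElements(); ) {
            Driver d = e.nextElement();
            if (d.getClass().getName().equals(userSpecifiedClass)) {
                return d;
            }
        }
        throw new IllegalStateException("Driver not registered: " + userSpecifiedClass);
    }

    public static void main(String[] args) throws SQLException {
        DriverManager.registerDriver(new DriverA());
        DriverManager.registerDriver(new DriverB());

        String url = "jdbc:demo:whatever";
        // Both drivers accept the URL, so DriverManager.getConnection(url)
        // would simply use whichever driver happened to register first.
        System.out.println("DriverA accepts: " + new DriverA().acceptsURL(url));
        System.out.println("DriverB accepts: " + new DriverB().acceptsURL(url));

        // Honoring the user's choice requires selecting explicitly:
        System.out.println("chosen: " + pickDriver("DriverB").getClass().getName());
    }
}
```

      Explicit selection by class name, rather than relying on `DriverManager`'s first-match behavior, is the general shape of the fix for this kind of precedence bug.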


          People

            Assignee: Josh Rosen (joshrosen)
            Reporter: Josh Rosen (joshrosen)
            Votes: 0
            Watchers: 4
