SPARK-5472: Add support for reading from and writing to a JDBC database


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.3.0
    • Component/s: SQL
    • Labels: None
    • Target Version/s:

      Description

      It would be nice to be able to make a table in a JDBC database appear as a table in Spark SQL. This would let users, for instance, perform a JOIN between a DataFrame in Spark SQL and a table in a Postgres database.
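
      As a minimal sketch of what this looks like, the snippet below uses the DataFrame JDBC reader as it exists in later Spark releases (not necessarily the exact API added by this ticket). The connection URL, credentials, table, and column names are hypothetical, and the Postgres JDBC driver is assumed to be on the classpath; the point is that the schema is inferred from the database table itself.

      {code:scala}
      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().appName("jdbc-join-sketch").getOrCreate()

      // Expose a Postgres table as a DataFrame; its schema is read from the
      // database's own metadata, so no user-supplied schema is required.
      val customers = spark.read
        .format("jdbc")
        .option("url", "jdbc:postgresql://db-host:5432/sales")   // hypothetical URL
        .option("dbtable", "public.customers")                   // hypothetical table
        .option("user", "spark")
        .option("password", "secret")
        .load()

      // Join the JDBC-backed table with an ordinary Spark DataFrame.
      val orders = spark.read.parquet("/data/orders")
      val joined = orders.join(customers, "customer_id")
      joined.show()
      {code}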

      It might also be nice to be able to go the other direction – save a DataFrame to a database – for instance in an ETL job.
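
      Continuing the sketch above, the reverse direction could look like the following, again using the writer API from later Spark releases with hypothetical connection details: the enriched DataFrame is appended to a JDBC table at the end of an ETL job.

      {code:scala}
      import java.util.Properties
      import org.apache.spark.sql.SaveMode

      val props = new Properties()
      props.setProperty("user", "spark")
      props.setProperty("password", "secret")

      // Persist the joined DataFrame into a (hypothetical) Postgres table.
      joined.write
        .mode(SaveMode.Append)   // or Overwrite, depending on the job
        .jdbc("jdbc:postgresql://db-host:5432/sales", "public.enriched_orders", props)
      {code}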

      Edited to clarify: Both of these tasks are certainly possible to accomplish at the moment with a little bit of ad-hoc glue code. However, there is no fundamental reason why the user should need to supply the table schema and some code for pulling data out of a ResultSet row into a Catalyst Row structure when this information can be derived from the schema of the database table itself.



            People

            • Assignee: Tor Myklebust (tmyklebu)
            • Reporter: Tor Myklebust (tmyklebu)
            • Votes: 0
            • Watchers: 7
