SPARK-12010

Spark JDBC requires support for column-name-free INSERT syntax

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.2
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: SQL
    • Labels: None

    Description

      Spark JDBC write works only with technologies that support the following INSERT statement syntax (see JdbcUtils.scala: insertStatement()):

      INSERT INTO $table VALUES ( ?, ?, ..., ? )
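
      For reference, a simplified sketch (an illustration of the mechanism, not the exact Spark source) of how such a column-name-free statement is built from the DataFrame schema and turned into a JDBC PreparedStatement:

        import java.sql.{Connection, PreparedStatement}
        import org.apache.spark.sql.types.StructType

        // Build "INSERT INTO <table> VALUES (?, ?, ..., ?)" with one placeholder
        // per field of the DataFrame schema; no column names are listed.
        def insertStatement(conn: Connection, table: String, rddSchema: StructType): PreparedStatement = {
          val placeholders = rddSchema.fields.map(_ => "?").mkString(", ")
          conn.prepareStatement(s"INSERT INTO $table VALUES ($placeholders)")
        }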

      Some technologies require a list of column names:

      INSERT INTO $table ( $colNameList ) VALUES ( ?, ?, ..., ? )

      Therefore, technologies like the Progress JDBC Driver for Cassandra do not work with Spark JDBC write.

      Idea for fix:
      Move JdbcUtils.scala:insertStatement() into SqlDialect and add a SqlDialect for the Progress JDBC Driver for Cassandra.
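
      A minimal sketch of that idea (the trait and object names below are hypothetical, chosen for illustration only, not the final Spark API): the SQL-string generation becomes a dialect method whose default keeps today's column-name-free form, while a dialect for drivers such as the Progress JDBC Driver for Cassandra overrides it to list the column names explicitly.

        import org.apache.spark.sql.types.StructType

        // Hypothetical dialect hook: the default keeps the column-name-free form.
        trait InsertSqlDialect {
          def insertStatement(table: String, schema: StructType): String = {
            val placeholders = schema.fields.map(_ => "?").mkString(", ")
            s"INSERT INTO $table VALUES ($placeholders)"
          }
        }

        // Dialect for drivers that require an explicit column list.
        object ColumnListingDialect extends InsertSqlDialect {
          override def insertStatement(table: String, schema: StructType): String = {
            val columns      = schema.fields.map(_.name).mkString(", ")
            val placeholders = schema.fields.map(_ => "?").mkString(", ")
            s"INSERT INTO $table ($columns) VALUES ($placeholders)"
          }
        }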

          Activity

            huaxing Huaxin Gao added a comment -

            I would like to work on this problem.

            CK49 Christian Kurz added a comment -

            Hi Huaxin,
            Thank you for your kind offer.
            Actually, I already have a code suggestion available, for which I hope to create a pull request shortly. Maybe you could review it and add your thoughts to the pull request?
            Thanks,
            Christian

            apachespark Apache Spark added a comment -

            User 'CK50' has created a pull request for this issue:
            https://github.com/apache/spark/pull/10003

            marmbrus Michael Armbrust added a comment -

            Thanks for working on this, but we've already hit code freeze for 1.6.0 so I'm going to retarget. Typically let project committers set the "target version".

            apachespark Apache Spark added a comment -

            User 'CK50' has created a pull request for this issue:
            https://github.com/apache/spark/pull/10066

            apachespark Apache Spark added a comment -

            User 'CK50' has created a pull request for this issue:
            https://github.com/apache/spark/pull/10312

            srowen Sean R. Owen added a comment -

            Issue resolved by pull request 10380
            https://github.com/apache/spark/pull/10380


            People

              Assignee: CK49 Christian Kurz
              Reporter: CK49 Christian Kurz
              Votes: 0
              Watchers: 4

                Time Tracking

                  Original Estimate: 24h
                  Remaining Estimate: 24h
                  Time Spent: Not Specified