SPARK-30828: Improve insertInto behaviour

    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: 3.1.0
    • Fix Version/s: None
    • Component/s: Spark Core, SQL
    • Labels: None
    • Target Version/s:

      Description

      Currently, when you call insertInto to add a DataFrame to an existing table, the only safety check is that the number of columns matches; column order doesn't matter, and the error message when the column count doesn't match is not very helpful, especially when the table has a lot of columns:

       org.apache.spark.sql.AnalysisException: `default`.`table` requires that the data to be inserted have the same number of columns as the target table: target table has 2 column(s) but the inserted data has 1 column(s), including 0 partition column(s) having constant value(s).; 

      I think a standard column check would be very helpful, just like in most other cases in Spark, e.g.:

       

      "cannot resolve 'p2' given input columns: [id, p1];"  
              People

              • Assignee: apachespark (Apache Spark)
              • Reporter: gschiavon (German Schiavon Matteo)
              • Votes: 0
              • Watchers: 1
