SPARK-26234: Column list specification in INSERT statement

Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 2.3.0
    • Fix Version/s: None
    • Component/s: PySpark
    • Labels: None

    Description

      While trying to overwrite a Hive table with a specific column list from Spark (PySpark) using a DataFrame, the following error is raised:

      pyspark.sql.utils.ParseException: u"\nmismatched input 'col1' expecting
      {'(', 'SELECT', 'FROM', 'VALUES', 'TABLE', 'INSERT', 'MAP', 'REDUCE'}
      (line 1, pos 36)\n\n== SQL ==\ninsert into table DB.TableName (Col1, Col2, Col3) select Col1, Col2, Col3 FROM dataframe\n------------------------------------^^^\n"

      sparkSession.sql("insert into table DB.TableName (Col1, Col2, Col3) select Col1, Col2, Col3 FROM dataframe")

      The same statement, however, runs fine when executed from the Hive terminal.

      Please check the link below for more information on this issue.

      https://stackoverflow.com/questions/53517671/column-list-specification-in-insert-overwrite-statement
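
      Since the Spark 2.3 SQL parser rejects the column list, a possible workaround (a minimal sketch, assuming the target table DB.TableName already exists with exactly the columns Col1, Col2, Col3, and using a hypothetical source table DB.SourceTable) is to drop the column list and line the columns up positionally with the target schema:

      from pyspark.sql import SparkSession

      # Hive support is required to write into an existing Hive table.
      spark = SparkSession.builder.enableHiveSupport().getOrCreate()

      # Hypothetical source DataFrame; any DataFrame with Col1, Col2, Col3 works.
      df = spark.table("DB.SourceTable")
      df.createOrReplaceTempView("dataframe")

      # Workaround 1: omit the column list and have the SELECT produce the
      # columns in the exact order of DB.TableName's schema.
      spark.sql("insert into table DB.TableName select Col1, Col2, Col3 FROM dataframe")

      # Workaround 2: write through the DataFrame API, which also matches
      # columns by position against the target table.
      df.select("Col1", "Col2", "Col3").write.insertInto("DB.TableName")

      Both variants match columns by position, so the SELECT (or select()) must list the columns in the same order as the target table's schema.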

    People

      Assignee: Unassigned
      Reporter: Joby Joje
      Votes: 0
      Watchers: 1
