SPARK-43359: DELETE from Hive table results in INTERNAL error

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 3.4.0
    • Fix Version/s: 3.5.0
    • Component/s: Spark Core
    • Labels: None

Description

      spark-sql (default)> CREATE TABLE T1(c1 INT);
      spark-sql (default)> DELETE FROM T1 WHERE c1 = 1;
      [INTERNAL_ERROR] Unexpected table relation: HiveTableRelation [`spark_catalog`.`default`.`t1`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: c1#3, Partition Cols: []]

      org.apache.spark.SparkException: [INTERNAL_ERROR] Unexpected table relation: HiveTableRelation [`spark_catalog`.`default`.`t1`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: c1#3, Partition Cols: []]
      at org.apache.spark.SparkException$.internalError(SparkException.scala:77)
      at org.apache.spark.SparkException$.internalError(SparkException.scala:81)
      at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Strategy.apply(DataSourceV2Strategy.scala:310)
      at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$1(QueryPlanner.scala:63)
      at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
      at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
      at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
      at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
      at org.apache.spark.sql.execution.SparkStrategies.plan(SparkStrategies.scala:70)
      at org.apache.spark.sql.catalyst.planning.QueryPlanner.$anonfun$plan$3(QueryPlanner.scala:78)
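
      Note (hedged analysis): the stack trace shows DataSourceV2Strategy.apply
      rejecting the relation, so DELETE FROM is only planned for DataSource V2
      tables; a legacy HiveTableRelation falls through to the internal-error
      branch. Given Resolution: Fixed with Fix Version 3.5.0, the fix presumably
      surfaces a user-facing unsupported-operation error rather than adding
      DELETE support for Hive tables. In the meantime, a minimal SQL workaround
      sketch (the staging table name T1_tmp is hypothetical) is to rewrite the
      table without the matching rows:

      -- Keep every row for which the DELETE predicate is not true.
      -- NOT (c1 <=> 1) uses null-safe equality so rows with c1 = NULL
      -- survive, matching DELETE FROM T1 WHERE c1 = 1 semantics.
      CREATE TABLE T1_tmp AS SELECT * FROM T1 WHERE NOT (c1 <=> 1);
      INSERT OVERWRITE TABLE T1 SELECT * FROM T1_tmp;
      DROP TABLE T1_tmp;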

People

    • Assignee: BingKun Pan (panbingkun)
    • Reporter: Serge Rielau (srielau)
