Spark > SPARK-33392 Align DSv2 commands to DSv1 implementation > SPARK-33904

Recognize `spark_catalog` in `saveAsTable()` and `insertInto()`


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version: 3.2.0
    • Fix Version: 3.2.0
    • Component: SQL
    • Labels: None

    Description

      The v1 INSERT INTO command recognizes `spark_catalog` as the default session catalog:

      spark-sql> create table spark_catalog.ns.tbl (c int);
      spark-sql> insert into spark_catalog.ns.tbl select 0;
      spark-sql> select * from spark_catalog.ns.tbl;
      0
      

      but the `saveAsTable()` and `insertInto()` methods do not allow writing to a table when the catalog `spark_catalog` is specified explicitly:

      scala> sql("CREATE NAMESPACE spark_catalog.ns")
      scala> Seq(0).toDF().write.saveAsTable("spark_catalog.ns.tbl")
      org.apache.spark.sql.AnalysisException: Couldn't find a catalog to handle the identifier spark_catalog.ns.tbl.
        at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:629)
        ... 47 elided
      scala> Seq(0).toDF().write.insertInto("spark_catalog.ns.tbl")
      org.apache.spark.sql.AnalysisException: Couldn't find a catalog to handle the identifier spark_catalog.ns.tbl.
        at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:498)
        ... 47 elided
      
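Until the resolution landed, one workaround consistent with the behavior shown above would be to route the write through the SQL `INSERT INTO` path, which already resolves `spark_catalog`, instead of calling `insertInto()`. This is a sketch, not the actual fix; the view name `tmp_src` and the table `ns.tbl` are assumed from the example above:

```scala
// Sketch of a workaround: register the DataFrame as a temp view and use the
// SQL INSERT INTO command, which already recognizes `spark_catalog`.
scala> Seq(0).toDF().createOrReplaceTempView("tmp_src")
scala> sql("INSERT INTO spark_catalog.ns.tbl SELECT * FROM tmp_src")
scala> sql("SELECT * FROM spark_catalog.ns.tbl").show()
```

The actual fix (per this issue's resolution) makes `saveAsTable()` and `insertInto()` recognize `spark_catalog` directly, so the detour through SQL is unnecessary on 3.2.0 and later.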

          People

            Assignee: Max Gekk (maxgekk)
            Reporter: Max Gekk (maxgekk)
