Description
A simple way to reproduce it:
scala> sql("create table t using hive as select 1 as i")
res2: org.apache.spark.sql.DataFrame = []

scala> sql("select * from t").show
+---+
|  i|
+---+
|  1|
+---+

scala> sql("select * from spark_catalog.t").show
org.apache.spark.sql.AnalysisException: Table or view not found: spark_catalog.t; line 1 pos 14;
'Project [*]
+- 'UnresolvedRelation [spark_catalog, t]
The reason is that we first go into `ResolveTables`, which looks up the table successfully but then gives up because it's a v1 table. Next we go into `ResolveRelations`, which does not recognize catalog names at all.
Similar to https://issues.apache.org/jira/browse/SPARK-29966, we should make `ResolveRelations` responsible for looking up both v1 and v2 tables from the session catalog, and make it correctly recognize the catalog name.
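As a rough illustration of the intended identifier handling (a minimal, self-contained sketch with hypothetical names, not Spark's actual internal API): when a multi-part name starts with the session catalog name, the prefix should be stripped and the remainder resolved against the session catalog, so both `t` and `spark_catalog.t` find the same v1/v2 table.

object SessionCatalogPrefixSketch {
  // Hypothetical stand-in for the built-in session catalog name.
  val SessionCatalogName = "spark_catalog"

  // Return the name parts to look up in the session catalog.
  def stripSessionCatalogPrefix(parts: Seq[String]): Seq[String] = parts match {
    case Seq(head, rest @ _*) if head.equalsIgnoreCase(SessionCatalogName) && rest.nonEmpty =>
      rest  // "spark_catalog.t" resolves as "t"
    case other =>
      other // "t" or "db.t" already target the session catalog
  }

  def main(args: Array[String]): Unit = {
    println(stripSessionCatalogPrefix(Seq("spark_catalog", "t")))  // List(t)
    println(stripSessionCatalogPrefix(Seq("db", "t")))             // List(db, t)
  }
}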