Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
Description
This is a ticket to track progress on supporting the federation of multiple external catalogs. This would require establishing an API (similar to the current ExternalCatalog API) for getting information about external catalogs, as well as the ability to convert a table into a data source table.
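A minimal sketch, in plain Scala, of what such a federation-facing catalog API might look like. The names (FederatedCatalog, TableDescriptor) are hypothetical and are not part of Spark's existing ExternalCatalog API; they only illustrate the kind of operations the API would need to expose.

```scala
// Hypothetical sketch only; these names are not part of Spark's ExternalCatalog API.
// A descriptor carrying enough information to convert a table into a data source table.
case class TableDescriptor(
    name: String,
    schema: Map[String, String], // column name -> type, simplified for illustration
    provider: String)            // e.g. "parquet", "jdbc"

// The kind of interface an external catalog implementation would expose to the engine.
trait FederatedCatalog {
  def name: String
  def listDatabases(): Seq[String]
  def listTables(database: String): Seq[String]
  def loadTable(database: String, table: String): TableDescriptor
}
```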
As part of this, we would also need to support more than a two-level table identifier (database.table). At the very least we would need a three-level identifier for tables (catalog.database.table). A possible direction is to support arbitrarily deep hierarchical namespaces, similar to file systems.
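A sketch of an arbitrary-depth identifier; MultiPartIdentifier is an illustrative name, not an existing Spark class:

```scala
// Hypothetical sketch: an identifier with an arbitrary number of name parts,
// e.g. catalog.database.table or deeper, similar to a file system path.
case class MultiPartIdentifier(parts: Seq[String]) {
  require(parts.nonEmpty, "identifier must have at least one part")
  def table: String = parts.last          // last part is the table name
  def namespace: Seq[String] = parts.init // everything before it is the namespace
  override def toString: String = parts.map(p => s"`$p`").mkString(".")
}

object MultiPartIdentifier {
  // Naive split for illustration; a real SQL parser must handle quoting and escapes.
  def parse(s: String): MultiPartIdentifier =
    MultiPartIdentifier(s.split('.').toSeq)
}

// Example: a three-level identifier.
// MultiPartIdentifier.parse("prod_catalog.sales.orders").namespace == Seq("prod_catalog", "sales")
```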
Once we have this implemented, we can convert the current Hive catalog implementation into an external catalog that is "mounted" into an internal catalog.
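A sketch of the "mount" idea, building on the hypothetical FederatedCatalog and MultiPartIdentifier types above; CatalogManager is likewise an illustrative name, not Spark's implementation:

```scala
// Hypothetical sketch: external catalogs are registered ("mounted") under an internal
// catalog manager, and identifiers are resolved against the mounted catalogs.
class CatalogManager(defaultCatalog: FederatedCatalog) {
  private val mounts = scala.collection.mutable.Map(defaultCatalog.name -> defaultCatalog)

  // Mount another external catalog (e.g. the current Hive catalog implementation).
  def mount(catalog: FederatedCatalog): Unit = mounts(catalog.name) = catalog

  // Resolve catalog.database.table; two-part names fall back to the default catalog.
  def resolve(id: MultiPartIdentifier): TableDescriptor = id.parts match {
    case Seq(catalog, db, table) => mounts(catalog).loadTable(db, table)
    case Seq(db, table)          => defaultCatalog.loadTable(db, table)
    case other =>
      throw new IllegalArgumentException(s"Cannot resolve identifier: ${other.mkString(".")}")
  }
}
```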
Attachments
Issue Links
- is related to
  - SPARK-23443 Spark with Glue as external catalog (Open)
- relates to
  - SPARK-15691 Refactor and improve Hive support (Resolved)
  - SPARK-24814 Relationship between catalog and datasources (Resolved)
  - SPARK-24017 Refactor ExternalCatalog to be an interface (Resolved)
Sub-tasks
1. Spark SQL ExternalCatalog API custom implementation support (Closed, Unassigned)