The document at http://atlas.apache.org/Hook-Hive.html only mentions that import-hive.sh is used to import Hive metadata.
It would be helpful to document additional details, such as:
- Before Atlas can receive metadata updates through Kafka messaging, it needs to initialize its Hive metadata state with the databases/tables already present in Apache Hive. This is done only once, or whenever Atlas falls out of sync with Hive for any reason.
- The import script actually calls HiveMetaStoreBridge to do the real work, which eventually fetches database metadata using the Hive metastore client via the IMetaStoreClient API.
Such details would help when integrating Atlas with query engines other than Hive.
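For illustration, the initial-sync flow described above can be sketched as follows. This is a minimal, self-contained sketch, not the real HiveMetaStoreBridge code: the MetastoreClient interface here is a hypothetical stand-in mirroring only the two IMetaStoreClient calls relevant to the walk (getAllDatabases / getAllTables), and the "registration" step is represented by collecting qualified names instead of creating Atlas entities.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the subset of Hive's IMetaStoreClient used by
// the initial import; the real bridge talks to a live Hive metastore.
interface MetastoreClient {
    List<String> getAllDatabases();
    List<String> getAllTables(String dbName);
}

public class ImportSketch {
    // One-time initial sync: walk every database and every table in it,
    // registering each with Atlas (here: collecting qualified names).
    static List<String> importAll(MetastoreClient client) {
        List<String> registered = new ArrayList<>();
        for (String db : client.getAllDatabases()) {
            registered.add(db);                    // register the database entity
            for (String table : client.getAllTables(db)) {
                registered.add(db + "." + table);  // register the table entity
            }
        }
        return registered;
    }

    public static void main(String[] args) {
        // In-memory stub in place of a live metastore.
        Map<String, List<String>> fake = new LinkedHashMap<>();
        fake.put("sales", Arrays.asList("orders", "customers"));
        fake.put("hr", Arrays.asList("employees"));
        MetastoreClient stub = new MetastoreClient() {
            public List<String> getAllDatabases() { return new ArrayList<>(fake.keySet()); }
            public List<String> getAllTables(String db) { return fake.get(db); }
        };
        System.out.println(importAll(stub));
        // → [sales, sales.orders, sales.customers, hr, hr.employees]
    }
}
```

A bridge for another query engine would follow the same shape: enumerate the engine's catalogs/schemas and tables through its own metadata API, then register the corresponding entities with Atlas.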