Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
While Apache NiFi provides provenance and event-level lineage support within its data flows, Apache Atlas manages lineage between datasets and the processes that interact with them.
It would be beneficial for users who run both NiFi and Atlas to see end-to-end data lineage in the Atlas lineage graph, since some types of datasets, for example Kafka topics and Hive tables, are processed both by NiFi and by technologies around Atlas such as Storm, Falcon, or Sqoop.
To make this integration happen, I propose a NiFi reporting task that analyzes the NiFi flow and creates DataSet and Process entities in Atlas.
The challenge is how to represent NiFi flow dataset-level lineage within the Atlas lineage graph.
If we simply add a single NiFi process entity and connect every DataSet to and from it, the result is too ambiguous, since it would not be clear which part of a NiFi flow actually interacts with a given dataset.
But if we register every NiFi processor as an independent process in Atlas, it becomes too granular; and since we already have detailed event-level lineage in NiFi, we do not need the same level of detail in Atlas.
If we can group certain processors in a NiFi flow into a single process in Atlas, that would be a good level of granularity.
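As a rough illustration, a minimal sketch of what such a reporting task could look like is below. Only the NiFi reporting API types (AbstractReportingTask, ReportingContext, ProcessGroupStatus) are real; AtlasLineageReportingTask and the commented-out atlasEntityHelper call are hypothetical placeholders for the Atlas entity creation, and here a "group" is naively taken to be a NiFi process group, which is exactly the open design question above.
{code:java}
import org.apache.nifi.controller.status.ProcessGroupStatus;
import org.apache.nifi.reporting.AbstractReportingTask;
import org.apache.nifi.reporting.ReportingContext;

public class AtlasLineageReportingTask extends AbstractReportingTask {

    @Override
    public void onTrigger(final ReportingContext context) {
        // Snapshot of the current flow structure, provided by the NiFi reporting API.
        final ProcessGroupStatus rootGroup = context.getEventAccess().getControllerStatus();

        // Register one Atlas Process per group of processors rather than per processor,
        // to keep the Atlas lineage graph at a useful level of granularity.
        registerGroupAsAtlasProcess(rootGroup);
    }

    private void registerGroupAsAtlasProcess(final ProcessGroupStatus group) {
        // Hypothetical helper call: create or update a Process entity in Atlas for this
        // group and link it to the DataSet entities (e.g. Kafka topics, Hive tables)
        // referenced by the group's processors.
        //
        // atlasEntityHelper.createOrUpdateProcess(group.getId(), group.getName(), inputs, outputs);

        // Recurse into child process groups.
        for (final ProcessGroupStatus child : group.getProcessGroupStatus()) {
            registerGroupAsAtlasProcess(child);
        }
    }
}
{code}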
Attachments
Issue Links
- is related to
  - NIFI-4654 RAW S2S transit URI should contain Port ID instead of FlowFile ID (Resolved)
- requires
  - NIFI-4543 Improve HBase processors provenance transit URL (Resolved)
  - NIFI-4544 Improve HDFS processors provenance transit URL (Resolved)
  - NIFI-4545 Improve Hive processors provenance transit URL (Resolved)
  - NIFI-4546 Make ReportingTask aware of node type in a cluster (Resolved)
  - NIFI-4547 Add utility to consume Provenance events by different ReportingTasks (Resolved)
  - NIFI-4548 Add REMOTE_INVOCATION provenance event type (Resolved)
- links to