Details
- Type: Improvement
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: None
- Fix Version/s: None
Description
Currently, when using ExecuteSQL, a very large result set can take quite a long time to pull back in full. It would be nice to be able to specify the maximum number of records to put into each FlowFile, so that if we pull back, say, 1,000,000 records, we can configure the processor to create 1,000 FlowFiles of 1,000 records each. This way, we can begin processing the first 1,000 records while the next 1,000 are still being pulled from the remote database.
This suggestion comes from Vinay via the dev@ mailing list:
Is there a way to have a streaming feature when a large result set is fetched from the
database, i.e., to read data from the database in chunks of records
instead of loading the full result set into memory?
As part of ExecuteSQL, could a property called "FetchSize" be specified which
indicates how many rows should be fetched from the result set at a time?
Since I am a bit new to using NiFi, can anyone guide me on the above?
Thanks in advance
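For illustration, here is a minimal sketch of the chunking idea in plain JDBC, independent of how ExecuteSQL itself ends up implementing it: the statement's fetch size hints the driver to stream rows in batches, and rows are accumulated into fixed-size chunks (analogous to one FlowFile per chunk). The connection URL, table, column names, MAX_ROWS_PER_CHUNK, and emitChunk are placeholders for this example, not NiFi APIs.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

public class ChunkedQueryExample {

    // Illustrative chunk size: how many rows go into each batch
    // (analogous to the proposed "max rows per FlowFile" / fetch-size property).
    private static final int MAX_ROWS_PER_CHUNK = 1000;

    public static void main(String[] args) throws Exception {
        // Placeholder connection details and query.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:postgresql://localhost/mydb", "user", "password");
             PreparedStatement stmt = conn.prepareStatement(
                 "SELECT id, name FROM big_table")) {

            // Hint to the JDBC driver to stream rows in batches rather than
            // buffering the entire result set in memory. (Some drivers, e.g.
            // PostgreSQL, only honor this with auto-commit disabled.)
            stmt.setFetchSize(MAX_ROWS_PER_CHUNK);

            try (ResultSet rs = stmt.executeQuery()) {
                List<String> chunk = new ArrayList<>(MAX_ROWS_PER_CHUNK);
                while (rs.next()) {
                    chunk.add(rs.getLong("id") + "," + rs.getString("name"));
                    if (chunk.size() >= MAX_ROWS_PER_CHUNK) {
                        emitChunk(chunk);   // in NiFi terms: write out one FlowFile
                        chunk.clear();      // and start filling the next one
                    }
                }
                if (!chunk.isEmpty()) {
                    emitChunk(chunk);       // final, partially filled chunk
                }
            }
        }
    }

    // Stand-in for handing a completed batch downstream.
    private static void emitChunk(List<String> rows) {
        System.out.println("Emitting chunk with " + rows.size() + " rows");
    }
}

With this pattern, downstream processing can begin on the first chunk while later rows are still being fetched, which is the behavior the request describes.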