Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Fixed
Description
Motivation
User stories:
As a Flink user, I’d like to use DynamoDB as a sink for my data pipeline.
Scope:
- Implement an asynchronous sink for DynamoDB by extending the AsyncSinkBase class. The implementation can, for now, reside in its own module in flink-connectors.
- Implement an asynchronous sink writer for DynamoDB by extending the AsyncSinkWriter. The implementation must deal with failed requests and retry them using the requeueFailedRequestEntry method. If possible, the implementation should batch multiple write requests to DynamoDB (e.g. WriteRequest objects submitted via BatchWriteItem) for increased throughput; see the sketch after this list. The sink writer implemented here will be used by the sink class created as part of this story.
- Java / code-level docs.
- End-to-end testing: add tests that hit a real AWS instance. (How best to donate resources to the Flink project to allow this to happen?)
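
To make the batching and retry requirement concrete, below is a minimal, hypothetical sketch of what the core of the writer's submit logic could look like against the AWS SDK v2 DynamoDB client. It deliberately omits the Flink AsyncSinkWriter wiring (element converter, constructor parameters, requeue callback), whose exact signatures vary across Flink versions; the class name DynamoDbBatchWriteSketch, the method submitBatch, and TABLE_NAME are illustrative placeholders, not part of FLIP-171 or the AWS SDK.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;
import software.amazon.awssdk.services.dynamodb.model.BatchWriteItemRequest;
import software.amazon.awssdk.services.dynamodb.model.WriteRequest;

/**
 * Illustrative sketch only: the batching/retry core that a DynamoDB async sink
 * writer would perform when flushing a batch. Not the actual connector code.
 */
public class DynamoDbBatchWriteSketch {

    // Hypothetical table name; a real writer would take this as configuration.
    private static final String TABLE_NAME = "my-table";

    private final DynamoDbAsyncClient client = DynamoDbAsyncClient.create();

    /**
     * Submits a batch of WriteRequest entries via BatchWriteItem and hands any
     * unprocessed entries back to the caller so they can be requeued, mirroring
     * the "retry failed requests" requirement from the scope above.
     * Callers must respect DynamoDB's limit of 25 write requests per batch.
     */
    public void submitBatch(List<WriteRequest> entries, Consumer<List<WriteRequest>> retryCallback) {
        BatchWriteItemRequest request = BatchWriteItemRequest.builder()
                .requestItems(Map.of(TABLE_NAME, entries))
                .build();

        client.batchWriteItem(request).whenComplete((response, error) -> {
            if (error != null) {
                // The whole batch failed: requeue everything for a later attempt.
                retryCallback.accept(entries);
            } else {
                // DynamoDB may process only part of a batch; unprocessed items
                // are returned in the response and must be retried.
                List<WriteRequest> unprocessed =
                        response.unprocessedItems().getOrDefault(TABLE_NAME, List.of());
                retryCallback.accept(unprocessed);
            }
        });
    }
}
```

Note that BatchWriteItem can return unprocessed items even on a successful response, which is why the retry callback is invoked on both the failure and success paths.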
References
More details can be found at https://cwiki.apache.org/confluence/display/FLINK/FLIP-171%3A+Async+Sink
Issue Links
- causes
  - FLINK-25859 Documentation for DynamoDB Sink (Resolved)
- Dependent
  - FLINK-24041 [FLIP-171] Generic AsyncSinkBase (Resolved)
- is a parent of
  - FLINK-29895 Improve code coverage and integration tests for DynamoDB implementation of Async Sink (Resolved)
- is cloned by
  - FLINK-25731 Mark FlinkKinesisProducer/FlinkKinesisConsumer as deprecated (Closed)
- supercedes
  - FLINK-16504 Add a AWS DynamoDB sink (Closed)