Details
- Type: Improvement
- Status: Resolved
- Priority: P3
- Resolution: Implemented
Description
Some users may want to protect their sensitive data using tokenization.
We propose creating a Beam example template that demonstrates a Beam transform for protecting sensitive data via tokenization. The example will use an external service to perform the tokenization.
At a high level, the pipeline will:
- support batch (GCS) and streaming (Pub/Sub) input sources
- tokenize sensitive data via an external REST service (we plan to use Protegrity)
- write tokenized data to BigQuery or Bigtable
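The core of the tokenization step above can be sketched as a per-record transform that swaps sensitive field values for tokens. This is a minimal illustrative sketch, not the template's actual code: the real pipeline would call the external REST service (e.g. Protegrity) from inside a Beam DoFn, while here a deterministic stub tokenizer stands in so the logic is self-contained. The field names and the `tokenize` callable interface are assumptions for illustration.

```python
import hashlib
from typing import Callable, Dict, Set

def stub_tokenize(value: str) -> str:
    """Deterministic stand-in for a remote tokenization call.

    In the real template this would be an HTTP request to the
    external tokenization service; a hash-based token is used here
    purely so the sketch runs without network access.
    """
    return "tok_" + hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

def tokenize_record(record: Dict[str, str],
                    sensitive_fields: Set[str],
                    tokenize: Callable[[str], str] = stub_tokenize) -> Dict[str, str]:
    """Replace values of sensitive fields with tokens; pass other fields through."""
    return {key: tokenize(value) if key in sensitive_fields else value
            for key, value in record.items()}

# Example: tokenize only the "ssn" field of a record.
record = {"name": "Alice", "ssn": "123-45-6789", "city": "Austin"}
tokenized = tokenize_record(record, {"ssn"})
```

In the actual pipeline, this per-record logic would live inside a DoFn applied via ParDo, with the tokenizer batching requests to the REST service for throughput.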