Details
- Type: New Feature
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 3.0.0-alpha1
- Component/s: None
- Labels: None
Description
For use cases that deal with sensitive data, we often need to encrypt data to be stored safely at rest. Hadoop common provides a codec framework for compression algorithms. We start here. However, because encryption algorithms require some additional configuration and methods for key management, we introduce a crypto codec framework that builds on the compression codec framework. It cleanly distinguishes crypto algorithms from compression algorithms, but shares common interfaces between them where possible, and also carries extended interfaces where necessary to satisfy those needs. We also introduce a generic Key type, and supporting utility methods and classes, as a necessary abstraction for dealing with both Java crypto keys and PGP keys.
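As a rough illustration of that intent only (not the API in the attached patches), the sketch below shows how a crypto codec interface could extend the existing CompressionCodec interface while adding key-management hooks, alongside a minimal generic Key type. All names here (CryptoCodec, Key, setEncryptionKey, setDecryptionKey) are hypothetical.
{code:java}
// Illustrative sketch only; CryptoCodec, Key, and their methods are
// hypothetical names, not the API proposed in the attached patches.
import org.apache.hadoop.conf.Configurable;
import org.apache.hadoop.io.compress.CompressionCodec;

/** Generic key abstraction covering, e.g., JCE symmetric keys and PGP keys. */
class Key {
  enum KeyType { SYMMETRIC_KEY, PGP_PUBLIC_KEY, PGP_PRIVATE_KEY }

  private final KeyType keyType;
  private final byte[] rawKey;   // encoded key material

  Key(KeyType keyType, byte[] rawKey) {
    this.keyType = keyType;
    this.rawKey = rawKey.clone();
  }

  KeyType getKeyType() { return keyType; }
  byte[] getRawKey()   { return rawKey.clone(); }
}

/**
 * A crypto codec reuses the stream-creation methods of CompressionCodec
 * and adds the extra hooks encryption needs: supplying key material.
 */
interface CryptoCodec extends CompressionCodec, Configurable {
  /** Key used when creating encrypting output streams. */
  void setEncryptionKey(Key key);

  /** Key used when creating decrypting input streams. */
  void setDecryptionKey(Key key);
}
{code}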
The task for this feature breaks into two parts:
1. The crypto codec framework, built on the compression codec framework, which can be shared by all crypto codec implementations (a usage sketch follows this list).
2. The codec implementations themselves, such as AES and others.
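To make part 1 concrete, here is a hedged usage sketch that reuses the hypothetical CryptoCodec and Key types from the sketch above and instantiates a codec reflectively, in the style of the existing compression codec framework. The configuration key name and the loadKeyBytes() helper are made up for illustration; real key material would come from a key manager (see MAPREDUCE-5025), not from code or configuration.
{code:java}
// Hypothetical usage sketch, reusing the CryptoCodec/Key types sketched above.
// The config key name and helper below are illustrative, not real Hadoop APIs.
import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.util.ReflectionUtils;

public class CryptoWriteExample {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();

    // Resolve and instantiate the codec reflectively, as the compression
    // codec framework does; "hadoop.example.crypto.codec" is a made-up key.
    Class<? extends CryptoCodec> codecClass =
        conf.getClass("hadoop.example.crypto.codec", null, CryptoCodec.class);
    CryptoCodec codec = ReflectionUtils.newInstance(codecClass, conf);

    // Key material would normally be fetched from a key manager
    // (see MAPREDUCE-5025), never hard-coded.
    codec.setEncryptionKey(new Key(Key.KeyType.SYMMETRIC_KEY, loadKeyBytes()));

    // Writing through the codec looks the same as writing through a
    // compression codec; only the key setup above is new.
    FileSystem fs = FileSystem.get(conf);
    try (CompressionOutputStream out =
             codec.createOutputStream(fs.create(new Path("/data/example.enc")))) {
      out.write("sensitive bytes".getBytes(StandardCharsets.UTF_8));
      out.finish();
    }
  }

  private static byte[] loadKeyBytes() {
    return new byte[16]; // placeholder for key bytes obtained from a key manager
  }
}
{code}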
Attachments
Issue Links
- is depended upon by
  - MAPREDUCE-5025 Key Distribution and Management for supporting crypto codec in Map Reduce (Open)
- is required by
  - PIG-3289 Encryption aware load and store functions (Open)
  - HIVE-5207 Support data encryption for Hive tables (Resolved)
  - HADOOP-9997 Improve TFile API to be able to pass the context for encryption codecs (Open)
- relates to
  - ORC-14 Add column level encryption to ORC files (Closed)
Sub-Tasks
1. Hadoop crypto codec framework based on compression codec | Patch Available | Unassigned
2. Crypto codec implementations for AES | Resolved | Yi Liu
3. A TokenKeyProvider for a Centralized Key Manager Server (BEE: bee-key-manager) | Patch Available | Unassigned
4. A TokenKeyProvider for a Centralized Key Manager Server (BEE: bee-key-manager) | Resolved | Unassigned