Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Duplicate
Description
I was able to compile 0.18.2 in Eclipse into a new OSGi bundle using the Eclipse PDE. Using Spring to control the HDFS nodes, however, seems out of the question for the time being because of inter-dependencies between packages that should be separate OSGi bundles (for example, SecondaryNameNode holds direct references to StatusHttpServer, which should live in a bundle with a "web" personality, separate from Hadoop Core).

Looking through the code that starts the daemons, it seems code changes are necessary to allow components to be dependency-injected. Rather than instantiating a StatusHttpServer inside SecondaryNameNode, that reference should (at the very least) be injectable, for example from an OSGi service published by another bundle. Adding a setter for infoServer would allow the reference to be injected by Spring, as sketched below. This is just one example of the changes that would be needed to get Hadoop to live happily inside an OSGi container.
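To make the suggestion concrete, here is a minimal sketch of what an injectable web server could look like. The class and field names (SecondaryNameNode, StatusHttpServer, infoServer) come from the description above, but the method shapes and constructor arguments are illustrative assumptions, not the actual Hadoop 0.18 signatures:

{code:java}
// Simplified sketch: the web server becomes an injectable collaborator
// instead of being hard-wired inside the daemon.
public class SecondaryNameNode {

  private StatusHttpServer infoServer;

  // Hypothetical setter: lets Spring (or an OSGi service from another
  // bundle) supply the web server from outside.
  public void setInfoServer(StatusHttpServer infoServer) {
    this.infoServer = infoServer;
  }

  public void start() throws java.io.IOException {
    if (infoServer == null) {
      // Fall back to creating the server locally, so standalone
      // (non-container) startup keeps its current behavior.
      infoServer = new StatusHttpServer("secondary", "0.0.0.0", 50090, false);
    }
    infoServer.start();
  }
}
{code}

With such a setter in place, a Spring context could define the web server as its own bean and wire it in with a <property name="infoServer" .../> element, keeping the "web" bundle separate from the core bundle.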
As a starting point, it would be nice if Hadoop Core could be split into a client bundle that could be deployed into OSGi containers to provide client-only access to HDFS clusters; a sketch of what such a bundle might do follows.
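As a hedged illustration of that client-only bundle, the sketch below shows a standard OSGi BundleActivator that obtains an HDFS FileSystem handle and publishes it as a service for other bundles to consume. HdfsClientActivator is a hypothetical name, and the default URI for the configuration key is only a placeholder:

{code:java}
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Hypothetical activator for a client-only HDFS bundle: it connects to the
// cluster and registers the FileSystem handle as an OSGi service.
public class HdfsClientActivator implements BundleActivator {

  private ServiceRegistration registration;
  private FileSystem fs;

  public void start(BundleContext context) throws Exception {
    Configuration conf = new Configuration();
    // "fs.default.name" was the pre-0.20 key naming the default file system.
    URI uri = URI.create(conf.get("fs.default.name", "hdfs://localhost:9000"));
    fs = FileSystem.get(uri, conf);
    registration = context.registerService(FileSystem.class.getName(), fs, null);
  }

  public void stop(BundleContext context) throws Exception {
    if (registration != null) {
      registration.unregister();
    }
    if (fs != null) {
      fs.close();
    }
  }
}
{code}

Other bundles could then look up the FileSystem service from the OSGi registry instead of importing Hadoop Core packages directly.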
Attachments
Issue Links
- is duplicated by HADOOP-7977, "Allow Hadoop clients and services to run in an OSGi container" (Resolved)
Patch turning HDFS into an OSGi bundle and providing a hadoop-hdfs Karaf feature.