Details

    • Type: New Feature
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      I was able to compile 0.18.2 in Eclipse into a new OSGi bundle using the Eclipse PDE. Using Spring to control the HDFS nodes, however, seems out of the question for the time being because of inter-dependencies between packages that should live in separate OSGi bundles (for example, SecondaryNameNode holds direct references to StatusHttpServer, which belongs in a bundle with a "web" personality, separate from Hadoop Core). Looking through the code that starts the daemons, code changes appear necessary to allow components to be dependency-injected. Rather than instantiating a StatusHttpServer inside SecondaryNameNode, that reference should at the very least be injectable (for example, from an OSGi service published by another bundle). Adding a setter for infoServer would allow Spring to inject that reference. This is just one example of the changes needed to get Hadoop to live happily inside an OSGi container.
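The setter-injection change described above might look something like this minimal sketch. The names SecondaryNameNode, StatusHttpServer, and infoServer come from the issue; the HttpServerFacade interface, the setter signature, and all method bodies are illustrative assumptions, not the actual Hadoop API.

```java
// Hypothetical sketch of setter injection for the secondary namenode's
// HTTP status server. None of this is real Hadoop code.

// An assumed abstraction over the embedded status server, so the daemon
// depends on an interface rather than on a concrete web-bundle class.
interface HttpServerFacade {
    void start();
    void stop();
}

// Stand-in for the real StatusHttpServer living in a separate "web" bundle.
class StatusHttpServer implements HttpServerFacade {
    private boolean running;
    public void start() { running = true; }
    public void stop()  { running = false; }
    public boolean isRunning() { return running; }
}

// Stand-in for SecondaryNameNode: instead of constructing its own
// StatusHttpServer, it receives one from the container.
class SecondaryNameNode {
    private HttpServerFacade infoServer;

    // Setter injection point: Spring (or an OSGi service binder) calls this
    // with a server instance published by another bundle.
    public void setInfoServer(HttpServerFacade infoServer) {
        this.infoServer = infoServer;
    }

    public void startup() {
        if (infoServer == null) {
            throw new IllegalStateException("infoServer was not injected");
        }
        infoServer.start();
    }
}
```

With a setter like this, a Spring context could wire the server in with an ordinary property element, and an OSGi container could bind it from a service registered by the web bundle.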

      As a starting point, it would be nice if Hadoop core could be split into a client bundle, deployable into OSGi containers, that provides client-only access to HDFS clusters.
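A client-only bundle along those lines would carry an OSGi manifest roughly like the sketch below. The bundle symbolic name and the choice of which packages to export or import are assumptions for illustration; the actual split would have to follow from untangling the real package dependencies.

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: org.apache.hadoop.hdfs.client
Bundle-Version: 0.18.2
Export-Package: org.apache.hadoop.fs,
 org.apache.hadoop.hdfs
Import-Package: org.apache.hadoop.conf,
 org.apache.hadoop.io
```

The point of the split is that the exported packages contain only what an HDFS client needs, while server-side daemons (namenode, datanode, the web personality) live in separate bundles that other deployments can omit.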

      1. HDFS-201.patch (15 kB), Jean-Baptiste Onofré
          People

          • Assignee: Jean-Baptiste Onofré
          • Reporter: Jon Brisbin
          • Votes: 0
          • Watchers: 6