Hadoop Common / HADOOP-6483

Provide Hadoop as a Service based on standards


    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels:


      Hadoop as a Service (HaaS) provides a standards-based web services interface that layers on top of Hadoop on Demand and allows Hadoop jobs to be submitted via popular schedulers, such as Sun Grid Engine (SGE), Platform LSF and Microsoft HPC Server 2008, to local or remote Hadoop clusters. This allows multiple Hadoop clusters within an organization to be shared efficiently, and provides flexibility by letting remote Hadoop clusters, offered as Cloud services, be used for experimentation and burst capacity. HaaS hides complexity, allowing users to submit many types of compute- or data-intensive work via a single scheduler without knowing where it will actually run. Additionally, a standards-based front-end to Hadoop means that users can switch HaaS providers easily, without being locked in via proprietary interfaces such as Amazon's MapReduce service.

      Our HaaS implementation uses the OGF High Performance Computing Basic Profile (HPC-BP) standard to define interoperable job submission descriptions and management interfaces to Hadoop. It uses Hadoop on Demand to provision capacity. Our HaaS implementation also supports file stage-in/stage-out with protocols such as FTP, SCP and GridFTP.
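      As a rough illustration of what an HPC-BP-style submission description looks like, the sketch below builds a minimal JSDL JobDefinition document (JSDL is the job description format HPC-BP builds on) with a stage-in source URI. The hadoop command line, file names and FTP host here are illustrative assumptions, not taken from this implementation.

```python
import xml.etree.ElementTree as ET

# Namespaces from the JSDL 1.0 spec, which HPC Basic Profile builds on.
JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
HPCPA = "http://schemas.ggf.org/jsdl/2006/07/jsdl-hpcpa"


def build_jsdl(executable, args, stage_in_uri=None):
    """Build a minimal JSDL JobDefinition document as an XML string.

    Layout follows JSDL 1.0 with the HPC Profile Application extension;
    the DataStaging section sketches how an FTP/GridFTP source URI
    could be declared for stage-in.
    """
    job = ET.Element(f"{{{JSDL}}}JobDefinition")
    desc = ET.SubElement(job, f"{{{JSDL}}}JobDescription")
    app = ET.SubElement(desc, f"{{{JSDL}}}Application")
    hpc = ET.SubElement(app, f"{{{HPCPA}}}HPCProfileApplication")
    exe = ET.SubElement(hpc, f"{{{HPCPA}}}Executable")
    exe.text = executable
    for a in args:
        arg = ET.SubElement(hpc, f"{{{HPCPA}}}Argument")
        arg.text = a
    if stage_in_uri:
        staging = ET.SubElement(desc, f"{{{JSDL}}}DataStaging")
        source = ET.SubElement(staging, f"{{{JSDL}}}Source")
        uri = ET.SubElement(source, f"{{{JSDL}}}URI")
        uri.text = stage_in_uri
    return ET.tostring(job, encoding="unicode")


# Hypothetical wordcount job staged in from an FTP server.
doc = build_jsdl(
    "bin/hadoop",
    ["jar", "hadoop-examples.jar", "wordcount", "in", "out"],
    stage_in_uri="ftp://host.example.org/data/input.txt",
)
print(doc)
```

      A consumer such as SGE or LSF would hand a document like this to the HaaS endpoint rather than invoking Hadoop directly, which is what makes the front-end scheduler-neutral.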

      Our HaaS implementation also provides a suite of RESTful interfaces compliant with HPC-BP.
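      The issue does not document the actual REST paths, but a RESTful rendering of the HPC-BP CreateActivity operation would typically POST the JSDL document to an activities resource and then poll the returned activity URI for status. The sketch below builds (but does not send) such a request; the base URL and the "/haas/activities" path are illustrative assumptions.

```python
import urllib.request

# Hypothetical HaaS endpoint; not specified in the issue.
BASE = "http://haas.example.org"


def submission_request(jsdl_xml: str) -> urllib.request.Request:
    """Build an HTTP POST carrying a JSDL job description.

    Mirrors BES CreateActivity in a RESTful style: POST creates a new
    activity resource, whose URI can then be polled for job status.
    """
    return urllib.request.Request(
        url=f"{BASE}/haas/activities",
        data=jsdl_xml.encode("utf-8"),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )


req = submission_request("<jsdl:JobDefinition/>")
print(req.get_method(), req.full_url)
```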


        1. SC08-HPCBPforHadoop.ppt (1.31 MB, Yang Zhou)
        2. OGF27-HPCBPforHadoop.ppt (937 kB, Yang Zhou)



            • Assignee:
              woodyzhou Yang Zhou
            • Votes:
              1
            • Watchers:
              19


              • Created: