Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Won't Fix
    • Affects Version/s: 0.22.0
    • Fix Version/s: None
    • Component/s: None
    • Labels:
      None
    • Environment:
      Both platforms, Linux and Windows

Description

Ant tasks to make it easy to work with the Hadoop filesystem and to submit jobs.

<submit>: uploads the JAR and submits the job as a given user, with various settings.

Filesystem operations: mkdir, copyin, copyout, delete.
We could perhaps use Ant 1.7 "resources" here, so that HDFS can act as a source or destination in Ant's own tasks.
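As a rough illustration, a build file using the proposed tasks might look like the sketch below. None of these tasks exist yet; the task names, attributes, hostnames, and paths are all assumptions made up for this example.

```xml
<!-- Hypothetical usage of the proposed Hadoop Ant tasks.
     Task and attribute names are illustrative only. -->
<project name="hadoop-job" default="run">
  <target name="run">
    <!-- filesystem operations against the cluster -->
    <hadoop-mkdir path="hdfs://namenode:8020/user/alice/input"/>
    <hadoop-copyin src="data/input.txt"
                   dest="hdfs://namenode:8020/user/alice/input"/>

    <!-- upload the job JAR and submit it as a given user -->
    <hadoop-submit jar="build/wordcount.jar"
                   user="alice"
                   jobtracker="jobtracker:8021"/>

    <!-- pull results back and clean up -->
    <hadoop-copyout src="hdfs://namenode:8020/user/alice/output"
                    dest="build/results"/>
    <hadoop-delete path="hdfs://namenode:8020/user/alice/input"/>
  </target>
</project>
```

If the filesystem operations were instead exposed as Ant 1.7 resources, the copy steps could collapse into Ant's own <copy> task with an HDFS resource as source or dest.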

1. Security: need to specify the user; pick up user.name from the JVM as the default?
2. Cluster binding: is a namenode/job tracker (hostname, port) pair, or a URL, all that is needed?
3. Job conf: how to configure the job that is submitted? Support a list of <property name="name" value="something"/> children.
4. Testing: AntUnit to generate <junitreport>-compatible XML files.
5. Documentation: with an example using Ivy to fetch the JARs for the tasks and the Hadoop client.
6. Polling: an Ant task to block until a job has finished?
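The job-conf and polling points above could plausibly combine in a single submit task, sketched below under the assumption that the task accepts nested <property> children and an attribute telling it to block until the job completes. Everything here is a made-up illustration, not an implemented interface.

```xml
<!-- Sketch: job configuration via nested <property> children, plus a
     blocking attribute for the polling case. Names are assumptions. -->
<hadoop-submit jar="build/wordcount.jar"
               user="${user.name}"
               jobtracker="jobtracker:8021"
               waitforcompletion="true">
  <property name="mapred.reduce.tasks" value="4"/>
  <property name="mapred.job.name" value="wordcount"/>
</hadoop-submit>
```

Reusing Ant's standard <property name="…" value="…"/> element shape would keep the syntax familiar and let build-file property expansion (e.g. ${user.name}) work as usual.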

People

    • Assignee:
      Steve Loughran (stevel@apache.org)
    • Reporter:
      Steve Loughran (stevel@apache.org)
    • Votes:
      1
    • Watchers:
      6

Dates

    • Created:
    • Updated:
    • Resolved:

Time Tracking

    • Estimated: 168h
    • Remaining: 168h
    • Logged: Not Specified