Type: New Feature
Resolution: Won't Fix
Affects Version/s: 0.22.0
Fix Version/s: None
Environment: both platforms, Linux and Windows
Ant tasks to make it easy to work with the Hadoop filesystem and to submit jobs.
<submit>: uploads the JAR and submits the job as a given user, with various settings.
Filesystem operations: <mkdir>, <copyin>, <copyout>, <delete>.
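None of these tasks were ever implemented (the issue was resolved Won't Fix), but a build file using them might have looked something like the sketch below. The task names, attribute names, and the antlib namespace URI are all assumptions inferred from the list above, not a real API:

```xml
<!-- Hypothetical usage sketch only; these tasks were never implemented.
     Task names, attributes, and the antlib URI are assumptions. -->
<project name="hadoop-demo" default="run"
         xmlns:hdfs="antlib:org.apache.hadoop.ant">

  <target name="run">
    <!-- filesystem operations against a named cluster -->
    <hdfs:mkdir dir="/user/alice/input" namenode="namenode:8020"/>
    <hdfs:copyin src="data/local.txt" dest="/user/alice/input/"
                 namenode="namenode:8020"/>

    <!-- upload the job JAR and submit the job as a given user -->
    <hdfs:submit jar="build/wordcount.jar"
                 user="alice"
                 jobtracker="jobtracker:8021"/>

    <!-- pull results back and clean up -->
    <hdfs:copyout src="/user/alice/output/" dest="results/"
                  namenode="namenode:8020"/>
    <hdfs:delete path="/user/alice/input" namenode="namenode:8020"/>
  </target>
</project>
```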
- We could perhaps use Ant 1.7 "resources" here, so that HDFS can act as a source or destination in Ant's own tasks.
- Security: we need a way to specify the user; pick up user.name from the JVM as the default?
- Cluster binding: is a namenode/job tracker (hostname, port) or URL all that is needed?
- Job conf: how should the submitted job be configured? Support a list of <property name="name" value="something"> children.
- Testing: use AntUnit to generate <junitreport>-compatible XML files.
- Documentation: include an example that uses Ivy to fetch the JARs for the tasks and the Hadoop client.
- Polling: an Ant task that blocks until a job has finished?
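Tying several of the open questions above together, the submission task might have taken cluster binding and the user as attributes, the job conf as nested <property> children, and been paired with a companion task that blocks until the job finishes. Again, every name below is an assumption for illustration; nothing here was implemented:

```xml
<!-- Hypothetical sketch combining the cluster-binding, security,
     job-conf, and polling ideas. All names are assumptions. -->
<hdfs:submit jar="build/wordcount.jar"
             namenode="namenode:8020"
             jobtracker="jobtracker:8021"
             user="${user.name}"
             jobidproperty="submitted.job.id">
  <!-- job configuration as <property> children -->
  <property name="mapred.reduce.tasks" value="4"/>
  <property name="mapred.output.compress" value="true"/>
</hdfs:submit>

<!-- block until the submitted job finishes; fail the build if the job fails -->
<hdfs:waitforjob job="${submitted.job.id}"
                 timeout="3600000"
                 failonerror="true"/>
```

Note the `user="${user.name}"` default, which picks the user up from the JVM system property as suggested above.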
Status: Resolved