This is the MapReduce part of HADOOP-6332.
test-patch needs to verify Herriot integrity
Task Killing tests
Utilities specific to system tests.
Herriot's artifact id for Maven deployment should be set to hadoop-core-instrumented
[Herriot] Cleanup of temporary configurations is needed upon restart of a cluster.
Automatic resolution of Lzo codecs is needed.
Process tree cleanup tests for failed or killed tasks.
Process tree cleanup tests for tasks that exceed memory limits.
Process tree cleanup tests for suspended tasks.
[Herriot] Task Killing/Failing tests for a streaming job.
[Herriot] Implement functionality for getting the summary information of a job.
[Herriot] Test job summary information for different jobs.
[Herriot] Test Job cache directories cleanup after job completes.
[Herriot] TaskMemoryManager should log process-tree's status while killing tasks
Ant build changes for Streaming system tests in contrib projects.
[Herriot] Implement functionality for getting the user list for creating proxy users.
New properties for suspending and resuming processes.
[Herriot] New property for a multi-user list.
Large-scale Automated Test Framework
Herriot tests for MapReduce should support submission into a specific queue
Automate system test case for checking the file permissions in mapred.local.dir
Automate the job killing system test case.
Test scenario for "Killing Task Attempt id till job fails"
Create test scenario for "distributed cache file behaviour when the dfs file is not modified"
Create test scenario for "distributed cache file behaviour when the dfs file is modified"
Test scenario for a distributed cache file behaviour when the file is private
Automate test scenario verifying that successful/killed jobs' memory is properly removed from the jobtracker after these jobs retire.
Automate the test scenario in which job-related files are moved from the history directory to the done directory.
Test the job status of lost task trackers before and after the timeout.
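The mapred.local.dir permission check listed above can be sketched outside the Herriot API. This is a minimal, hypothetical illustration using only java.nio (not the actual Herriot test code): it asserts that a local directory grants no group or other permissions, the owner-only property a task-tracker local directory is expected to have. The class and method names are invented for this sketch.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class LocalDirPermissionCheck {
    // Returns true if the path grants no permissions to group or others,
    // i.e. it is readable/writable/executable by the owner only. This
    // mirrors the kind of check a mapred.local.dir permission test performs.
    static boolean isOwnerOnly(Path p) throws IOException {
        Set<PosixFilePermission> perms = Files.getPosixFilePermissions(p);
        for (PosixFilePermission perm : perms) {
            String name = perm.name();
            if (name.startsWith("GROUP_") || name.startsWith("OTHERS_")) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical stand-in for a task's local directory.
        Path dir = Files.createTempDirectory("mapred-local-");
        Files.setPosixFilePermissions(dir,
                PosixFilePermissions.fromString("rwx------"));
        System.out.println(isOwnerOnly(dir)); // prints "true"
    }
}
```

A real Herriot test would obtain the directory paths from the cluster configuration and run the check on each task tracker; this sketch only shows the per-directory assertion on a POSIX filesystem.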