I don't see much use in having tests that will hardly ever be run because they need a complex, four-step, semi-manual process requiring root privileges.
When we were developing the 0.20.* security line of releases, there was a time when we didn't have these tests and got bitten by bugs that could only be verified by running tests on clusters, a not-so-cheap process. The fact that the setup is complex and requires root privileges prevented us from setting up Jenkins to run these tests. So we wired them up, and developers ran them manually whenever there was a change in these parts of the code. I can personally vouch that they were of great help to us over time, even if runnable only in dev environments.
Is there a reason that container-executor.conf.dir and HADOOP_HOME must be set in the C code at compile time? Why can't we pass them in on the command line?
It is related to security. We didn't want to make the binary read the (huge) *-site.xml configuration files, but having a separate configuration file meant it had to be as secure as possible. That said, we have slowly come to rely on the binary being runnable only by the NM/TT, so this configuration could be removed in favour of command-line arguments. I'm planning to get that done via MAPREDUCE-3121, which needs changes in the same areas of code.