Hadoop Streaming exposes all the key/value pairs in the job conf as environment variables when it forks a process to run the streaming code. Unfortunately, the variable mapreduce_input_fileinputformat_inputdir contains the list of input files, and Linux imposes a limit on the combined size of environment variables and arguments.
Depending on how long the list of files and their full paths is, this variable can be huge. And since most of these variables are not even used, this prevents a user from running a Hadoop job with a large number of input files, even though the job could otherwise run.
Linux returns E2BIG (error code 7) if the size exceeds the limit, and Java translates that to "error=7, Argument list too long". More: http://man7.org/linux/man-pages/man2/execve.2.html I suggest skipping any variable whose value exceeds a certain length; if user code actually requires the skipped variable, it would then fail. This should also introduce a config variable to enable skipping long variables, set to false by default, so that the user has to explicitly set it to true to opt into this behavior.
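The skip logic could look roughly like the sketch below. This is a minimal illustration, not the actual patch; the class name, the length threshold, and the boolean flag standing in for the proposed config variable are all hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class EnvFilter {
    // Hypothetical threshold; Linux's real per-string limit (MAX_ARG_STRLEN)
    // is typically 128 KiB, so anything in that neighborhood is already unsafe.
    static final int MAX_ENV_VAR_LENGTH = 20 * 1024;

    // Drop any variable whose value exceeds the threshold, but only when
    // skipping has been explicitly enabled (the proposed config defaults to false).
    static Map<String, String> filterEnv(Map<String, String> env, boolean skipLongVars) {
        Map<String, String> result = new HashMap<>();
        for (Map.Entry<String, String> e : env.entrySet()) {
            if (skipLongVars && e.getValue().length() > MAX_ENV_VAR_LENGTH) {
                continue; // skipped; user code that needs this variable will fail
            }
            result.put(e.getKey(), e.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> env = new HashMap<>();
        env.put("mapreduce_input_fileinputformat_inputdir", "x".repeat(30 * 1024));
        env.put("SHORT_VAR", "ok");
        // Default behavior: nothing is skipped, preserving today's semantics.
        System.out.println(filterEnv(env, false).size());
        // Opt-in behavior: the oversized input-dir variable is dropped.
        System.out.println(filterEnv(env, true).keySet());
    }
}
```

With the flag left at its default of false the environment is passed through unchanged, so existing jobs are unaffected; only users who opt in trade away the long variable for the ability to launch the job at all.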
Here is the exception:
Hive does a similar trick:
HIVE-2372

I have a patch for this and will submit it soon.