Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 0.20.2
- Fix Version/s: None
- Component/s: None
- Environment: Unix, Java 1.6, hadoop 0.20.2
Description
The following work:
hadoop dfs -put test^ing /tmp
hadoop dfs -ls /tmp
The following do not:
hadoop dfs -ls /tmp/test^ing
hadoop dfs -get /tmp/test^ing test^ing
The first fails with "ls: Cannot access /tmp/test^ing: No such file or directory."
The second fails with "get: null".
It is possible to put a file whose name contains certain special characters, such as ^, using the hadoop shell. But once put, the file cannot be listed, cat'ed, or retrieved, because of the way some commands handle file globbing. Harsh J suggested on the mailing list that a flag to turn off globbing could be implemented. Alternatively, something like single-quoting the file path on the command line to disable globbing might work as well.
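The failure mode can be reproduced outside Hadoop entirely. If a glob-to-regex translation leaves '^' unescaped, the resulting Java regex treats it as a start-of-input anchor, so the pattern can never match the literal file name. A minimal stand-alone sketch (the translator below is hypothetical, not Hadoop's actual code):

```java
import java.util.regex.Pattern;

// Hypothetical demo (not Hadoop code): shows why leaving '^' unescaped
// in a glob-to-regex translation breaks matching of literal names.
public class CaretGlobDemo {
    // Minimal glob-to-regex translator that, like 0.20.2, forgets to
    // escape '^' outside a character set.
    static String naiveGlobToRegex(String glob) {
        StringBuilder regex = new StringBuilder();
        for (char c : glob.toCharArray()) {
            switch (c) {
                case '*': regex.append(".*"); break;
                case '?': regex.append('.'); break;
                case '.': regex.append("\\."); break;
                default:  regex.append(c); // '^' slips through as a regex anchor
            }
        }
        return regex.toString();
    }

    public static void main(String[] args) {
        String name = "test^ing";
        String pattern = naiveGlobToRegex(name);
        // A mid-pattern '^' anchors to the start of input, so this never matches:
        System.out.println(Pattern.matches(pattern, name));                     // false
        // Escaping the caret restores the literal match:
        System.out.println(Pattern.matches(pattern.replace("^", "\\^"), name)); // true
    }
}
```

This is why ls sees "No such file or directory": the glob derived from the path matches nothing, even though the file exists.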
As an example, in the 0.20.2 source the ^ character in particular is not escaped in the output pattern in FileSystem.java @line 1050, in setRegex(String filePattern):
...
} else if (pCh == '[' && setOpen == 0) {
  ...
} else if (pCh == '^' && setOpen > 0) {
  // '^' inside a set is passed through; outside a set no branch escapes it
} else if (pCh == '-' && setOpen > 0) {
  // Character set range
  setRange = true;
...
After looking in trunk, this seems to have been dealt with in later versions (the logic was refactored into GlobPattern.java):
...
case '^': // ^ inside [...] can be unescaped
  if (setOpen == 0)
    break;
case '!': //
...
But even after backporting that change to 0.20.2 and testing, it appears to resolve the issue for commands like ls, but not for get. So perhaps there is more to be done for other commands?
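The trunk-style fix can be sketched in isolation: escape '^' when it occurs outside a [...] set, and leave it alone inside one so set negation still works. The class and method names below are illustrative, not Hadoop's actual GlobPattern code:

```java
import java.util.regex.Pattern;

// Illustrative sketch only: escape '^' outside [...] while preserving its
// negation meaning inside a set, mirroring the approach taken in trunk.
public class CaretFixSketch {
    static String globToRegex(String glob) {
        StringBuilder regex = new StringBuilder();
        int setOpen = 0; // depth of [...] character sets
        for (char c : glob.toCharArray()) {
            switch (c) {
                case '[': setOpen++; regex.append(c); break;
                case ']': if (setOpen > 0) setOpen--; regex.append(c); break;
                case '*': regex.append(".*"); break;
                case '?': regex.append('.'); break;
                case '.': regex.append("\\."); break;
                case '^':
                    if (setOpen == 0) regex.append('\\'); // escape outside a set
                    regex.append(c);
                    break;
                default: regex.append(c);
            }
        }
        return regex.toString();
    }

    public static void main(String[] args) {
        // A literal caret in a file name now matches:
        System.out.println(Pattern.matches(globToRegex("test^ing"), "test^ing")); // true
        // A caret inside a set keeps its negation meaning:
        System.out.println(Pattern.matches(globToRegex("[^a]x"), "bx"));          // true
        System.out.println(Pattern.matches(globToRegex("[^a]x"), "ax"));          // false
    }
}
```

That get still fails after this change suggests some commands resolve paths through a different code path than the shared glob translation, which is consistent with the suggestion above that more work is needed per command.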
Attachments
Issue Links
- is related to: HADOOP-13099 Glob should return files with special characters in name (Resolved)