Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version: 0.21.0
- Hadoop Flags: Reviewed
Description
Bugs:
1. hadoop-version is not recognized when ant is run from src/contrib/ or src/contrib/hdfsproxy. When ant is run from $HADOOP_HDFS_HOME, hadoop-version is passed to the contrib builds through subant, but when it is run from src/contrib or src/contrib/hdfsproxy, hadoop-version is not recognized.
2. LdapIpDirFilter.java is not thread-safe. userName, Group, and Paths are per-request state and cannot be class members.
3. Addressed the following StackOverflowError:
ERROR [org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/].[proxyForward]] Servlet.service() for servlet proxyForward threw exception
java.lang.StackOverflowError
at org.apache.catalina.core.ApplicationHttpRequest.getAttribute(ApplicationHttpRequest.java:229)
This occurs when the target war (/target.war) does not exist: the forwarding war then forwards to its parent context path /, which is the forwarding war itself, causing an infinite loop. Added "HDFS Proxy Forward".equals(dstContext.getServletContextName()) to the if logic to break the loop.
4. Kerberos credentials of the remote user are not available. HdfsProxy needs to act on behalf of the real user to service requests.
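Bug 2 above can be illustrated with a minimal sketch (hypothetical class and field names, not the actual LdapIpDirFilter code): a servlet container creates one filter instance and calls it from many request threads concurrently, so per-request state held in an instance field can be overwritten by another thread mid-request, while state held in local variables cannot.

```java
// Minimal sketch of the thread-safety issue in bug 2 (hypothetical names,
// not the actual LdapIpDirFilter code). A servlet container creates ONE
// filter instance and dispatches many request threads into it.
public class FilterThreadSafety {

    // BROKEN: per-request state stored in a shared instance field.
    static class UnsafeFilter {
        private String userName;                 // shared across all request threads

        String handle(String requestUser) throws InterruptedException {
            this.userName = requestUser;         // thread A writes here...
            Thread.sleep(10);                    // ...thread B may overwrite meanwhile
            return this.userName;                // may return thread B's user
        }
    }

    // FIXED: per-request state kept in local variables of the method.
    static class SafeFilter {
        String handle(String requestUser) {
            String userName = requestUser;       // local: one copy per invocation
            return userName;
        }
    }

    public static void main(String[] args) {
        SafeFilter safe = new SafeFilter();
        // Each call sees only its own user, regardless of concurrency.
        System.out.println(safe.handle("alice")); // prints "alice"
        System.out.println(safe.handle("bob"));   // prints "bob"
    }
}
```

The same reasoning applies to the Group and Paths members named in the bug: anything derived from a single request must live on that request's stack (or in request attributes), not on the filter object.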
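The loop-breaking check in bug 3 can be sketched as follows (hypothetical types; the real code uses javax.servlet's ServletContext and RequestDispatcher). The idea is the one stated in the description: if the resolved destination context is the forwarding war itself, identified by its servlet context name "HDFS Proxy Forward", do not forward again.

```java
// Sketch of the guard from bug 3 (hypothetical Context interface standing
// in for javax.servlet.ServletContext). When /target.war is missing, the
// container resolves the parent context path "/", which is the forwarding
// war itself, so forwarding recurses until a StackOverflowError. Comparing
// the destination's servlet context name breaks the cycle.
public class ForwardLoopGuard {

    interface Context {
        String getServletContextName();
    }

    // True when forwarding is safe; false when the destination is the
    // forwarding war itself (which would recurse indefinitely).
    static boolean canForward(Context dstContext) {
        return !"HDFS Proxy Forward".equals(dstContext.getServletContextName());
    }

    public static void main(String[] args) {
        Context self   = () -> "HDFS Proxy Forward"; // fallback to "/" resolved to ourselves
        Context target = () -> "Target WAR";         // a real destination war
        System.out.println(canForward(self));   // prints "false": break the loop
        System.out.println(canForward(target)); // prints "true": forward normally
    }
}
```

Note that equals is called on the string literal, so the check is also null-safe when a context has no declared display name.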
Attachments
Issue Links
- duplicates: HDFS-1009 Support Kerberos authorization in HDFSProxy (Closed)
- incorporates: HDFS-482 change HsftpFileSystem's ssl.client.do.not.authenticate.server configuration setting to ssl-client.xml (Closed)
- is required by: HDFS-1012 documentLocation attribute in LdapEntry for HDFSProxy isn't specific to a cluster (Closed)