Hadoop HDFS
HDFS-3395

NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.0-alpha
    • Fix Version/s: 2.0.0-alpha
    • Component/s: namenode
    • Labels: None

      Description

      DFSUtil#substituteForWildcardAddress subs in a default hostname if the given hostname is 0.0.0.0. However, this function throws an exception if the given hostname is set to 0.0.0.0 and security is enabled, regardless of whether the default hostname is also 0.0.0.0. This function shouldn't throw an exception unless both addresses are set to 0.0.0.0.
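
      A minimal sketch of the intended check follows, assuming a simplified
      standalone method (the class, method, and parameter names below are
      illustrative only, not the committed patch; the real logic lives in
      DFSUtil#substituteForWildcardAddress):

      import java.io.IOException;
      import java.net.InetSocketAddress;

      // Illustrative sketch only -- not the committed patch.
      class WildcardSubstitutionSketch {
        static String substitute(String configured, String defaultHost,
                                 boolean securityEnabled) throws IOException {
          // Dummy port 0: we only need an InetSocketAddress so we can ask
          // whether the host part is the any-local (wildcard) address.
          InetSocketAddress configuredAddr = new InetSocketAddress(configured, 0);
          InetSocketAddress defaultAddr = new InetSocketAddress(defaultHost, 0);

          if (configuredAddr.getAddress() != null
              && configuredAddr.getAddress().isAnyLocalAddress()) {
            // Fixed behavior: only fail when the fallback host is also a wildcard.
            if (securityEnabled
                && defaultAddr.getAddress() != null
                && defaultAddr.getAddress().isAnyLocalAddress()) {
              throw new IOException("Cannot substitute a wildcard default for a "
                  + "wildcard address when security is enabled");
            }
            return defaultHost;   // substitute the default host for 0.0.0.0
          }
          return configured;      // non-wildcard addresses pass through unchanged
        }
      }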

      Attachments

      1. HDFS-3395.patch (2 kB) - Aaron T. Myers

        Activity

        Aaron T. Myers added a comment -

        Here's a patch which addresses the issue.

        No tests are included since security has to be enabled to test this code path. I tested it manually by setting the NN HTTP addresses to be wildcard in a secure HA setup, and confirming that checkpoints still proceed as expected.

        This patch also takes the liberty of fixing two typos I noticed in javadocs in the course of researching this bug.

        Eli Collins added a comment -

        Looks good. +1 pending jenkins (please merge to branch-2-alpha as well)

        Hadoop QA added a comment -

        -1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12526302/HDFS-3395.patch
        against trunk revision .

        +1 @author. The patch does not contain any @author tags.

        -1 tests included. The patch doesn't appear to include any new or modified tests.
        Please justify why no new tests are needed for this patch.
        Also please list what manual steps were performed to verify this patch.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 eclipse:eclipse. The patch built with eclipse:eclipse.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common hadoop-hdfs-project/hadoop-hdfs.

        +1 contrib tests. The patch passed contrib unit tests.

        Test results: https://builds.apache.org/job/PreCommit-HDFS-Build/2400//testReport/
        Console output: https://builds.apache.org/job/PreCommit-HDFS-Build/2400//console

        This message is automatically generated.

        Hudson added a comment -

        Integrated in Hadoop-Hdfs-trunk-Commit #2296 (See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/2296/)
        HDFS-3395. NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0. Contributed by Aaron T. Myers. (Revision 1336690)

        Result = SUCCESS
        atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1336690
        Files :

        • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetUtils.java
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSUtil.java
        Hudson added a comment -

        Integrated in Hadoop-Common-trunk-Commit #2221 (See https://builds.apache.org/job/Hadoop-Common-trunk-Commit/2221/)
        HDFS-3395. NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0. Contributed by Aaron T. Myers. (Revision 1336690)

        Result = SUCCESS
        atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1336690
        Files :

        • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetUtils.java
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSUtil.java
        Aaron T. Myers added a comment -

        Thanks a lot for the review, Eli. I've just committed this to trunk, branch-2, and branch-2.0.0-alpha.

        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk-Commit #2238 (See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Commit/2238/)
        HDFS-3395. NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0. Contributed by Aaron T. Myers. (Revision 1336690)

        Result = ABORTED
        atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1336690
        Files :

        • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetUtils.java
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSUtil.java
        Hudson added a comment -

        Integrated in Hadoop-Hdfs-trunk #1040 (See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1040/)
        HDFS-3395. NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0. Contributed by Aaron T. Myers. (Revision 1336690)

        Result = FAILURE
        atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1336690
        Files :

        • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetUtils.java
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSUtil.java
        Hudson added a comment -

        Integrated in Hadoop-Mapreduce-trunk #1076 (See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/1076/)
        HDFS-3395. NN doesn't start with HA+security enabled and HTTP address set to 0.0.0.0. Contributed by Aaron T. Myers. (Revision 1336690)

        Result = SUCCESS
        atm : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1336690
        Files :

        • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/net/NetUtils.java
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
        • /hadoop/common/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/DFSUtil.java
        Daryn Sharp added a comment -

        This really should have used NetUtils.createSocketAddrForHost(host, port) instead of string appending the port. Also, would it work correctly if the config was updated with the connect address for the socket? There's a Configuration#updateConnectAddr(key, addr) that will do that.

        Aaron T. Myers added a comment -

        This really should have used NetUtils.createSocketAddrForHost(host, port) instead of string appending the port.

        Good point, though this particular use of it is pretty innocuous, as the patch is just using a dummy port value (0) so we can get an InetSocketAddress on which we can call isAnyLocalAddress().

        Also, would it work correctly if the config was updated with the connect address for the socket? There's a Configuration#updateConnectAddr(key, addr) that will do that.

        I'm not sure I understand this. What config are you referring to updating?
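
        For context, here is a rough comparison of the two call shapes being
        discussed. It assumes only the NetUtils helpers named above; the
        wrapper method names are made up for illustration.

        import java.net.InetSocketAddress;
        import org.apache.hadoop.net.NetUtils;

        // Rough comparison only; wrapper method names are hypothetical.
        class WildcardCheckExample {
          // Roughly what the patch does: append a dummy port and parse the
          // combined "host:port" string back into an address.
          static boolean isWildcardViaStringAppend(String host) {
            InetSocketAddress addr = NetUtils.createSocketAddr(host + ":0");
            return addr.getAddress() != null
                && addr.getAddress().isAnyLocalAddress();
          }

          // The suggested alternative: pass host and port separately and skip
          // the concatenate-then-reparse round trip.
          static boolean isWildcardViaHostAndPort(String host) {
            InetSocketAddress addr = NetUtils.createSocketAddrForHost(host, 0);
            return addr.getAddress() != null
                && addr.getAddress().isAnyLocalAddress();
          }
        }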

        Daryn Sharp added a comment -

        True, appending the port doesn't hurt anything. I was just adding an FYI because it causes a fair amount of unnecessary work to process the host+port tuple compared to passing each as an arg.

        I looked at all the places where the substitute method is used, so scratch my comment about updating the config. I wonder, though: since a default of a wildcard address is permissible if security is off, does this change introduce inconsistent behavior when security is enabled? Kerberos requires a hostname, so should it just sub in the resolved host name (e.g. NetUtils.getConnectAddress(addr).getHostName()) if the default is a wildcard and security is enabled?
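
        A hypothetical sketch of that idea follows (it illustrates the
        suggestion above, not the behavior of the committed patch), assuming
        we fall back to the machine's canonical hostname when security is on
        and both addresses are wildcards:

        import java.net.InetAddress;
        import java.net.UnknownHostException;

        // Hypothetical only -- sketches the suggestion, not the committed fix.
        class ResolvedHostFallbackSketch {
          static String chooseHost(String configuredHost, String defaultHost,
              boolean securityEnabled) throws UnknownHostException {
            if (!"0.0.0.0".equals(configuredHost)) {
              return configuredHost;       // explicit address wins
            }
            if (!"0.0.0.0".equals(defaultHost)) {
              return defaultHost;          // substitute the non-wildcard default
            }
            if (securityEnabled) {
              // Kerberos needs a concrete hostname, so substitute the local
              // canonical name rather than failing outright.
              return InetAddress.getLocalHost().getCanonicalHostName();
            }
            return defaultHost;            // security off: wildcard is acceptable
          }
        }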


          People

          • Assignee: Aaron T. Myers
          • Reporter: Aaron T. Myers
          • Votes: 0
          • Watchers: 6
