Hadoop Common / HADOOP-10221

Add a plugin to specify SaslProperties for RPC protocol based on connection properties

    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.4.0
    • Component/s: security
    • Labels:
      None
    • Target Version/s:
    • Hadoop Flags:
      Reviewed
    • Release Note:
      SaslPropertiesResolver or its subclass is used to resolve the QOP used for a connection. The subclass can be specified via "hadoop.security.saslproperties.resolver.class" configuration property. If not specified, the full set of values specified in hadoop.rpc.protection is used while determining the QOP used for the connection. If a class is specified, then the QOP values returned by the class will be used while determining the QOP used for the connection.

      Note that this change effectively removes SaslRpcServer.SASL_PROPS, which was a public field. Any use of this variable should be replaced with the following code:
      SaslPropertiesResolver saslPropsResolver = SaslPropertiesResolver.getInstance(conf);
      Map<String, String> sasl_props = saslPropsResolver.getDefaultProperties();
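
      For illustration, here is a minimal, self-contained sketch of the migration and configuration described above; the custom resolver class in the commented-out lines is hypothetical:

      import java.util.Map;
      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.security.SaslPropertiesResolver;

      public class SaslPropsMigrationExample {
        public static void main(String[] args) {
          Configuration conf = new Configuration();
          // Optionally plug in a custom resolver; if unset, the default resolver
          // derives the QOP list from hadoop.rpc.protection.
          // conf.set("hadoop.security.saslproperties.resolver.class",
          //          "com.example.MySaslPropertiesResolver");  // hypothetical class

          // Replacement for the removed SaslRpcServer.SASL_PROPS field:
          SaslPropertiesResolver saslPropsResolver = SaslPropertiesResolver.getInstance(conf);
          Map<String, String> saslProps = saslPropsResolver.getDefaultProperties();
          System.out.println(saslProps);
        }
      }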

      Description

      Add a plugin to specify SaslProperties for RPC protocol based on connection properties.

      HADOOP-10211 enables the client and server to specify and support multiple QOPs. Some connections need to be restricted to a specific set of QOPs based on connection properties.
      E.g., connections from clients on a specific subnet need to be encrypted (QOP=privacy).
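
      As a hedged illustration of such a connection-based restriction, a resolver along the following lines could force privacy for one subnet. The subnet, the class name, and the assumption that getServerProperties(InetAddress) is the per-connection hook exposed by SaslPropertiesResolver are illustrative, not taken verbatim from the patch.

      import java.net.InetAddress;
      import java.util.HashMap;
      import java.util.Map;
      import javax.security.sasl.Sasl;
      import org.apache.hadoop.security.SaslPropertiesResolver;

      // Forces QOP=privacy ("auth-conf") for clients in one hard-coded subnet and
      // falls back to the configured defaults for everyone else.
      public class SubnetSaslPropertiesResolver extends SaslPropertiesResolver {
        @Override
        public Map<String, String> getServerProperties(InetAddress clientAddress) {
          if (clientAddress != null
              && clientAddress.getHostAddress().startsWith("10.20.")) {  // hypothetical subnet
            Map<String, String> props = new HashMap<String, String>(getDefaultProperties());
            props.put(Sasl.QOP, "auth-conf");  // privacy: authentication + encryption
            return props;
          }
          return getDefaultProperties();
        }
      }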

      1. HADOOP-10221.patch
        18 kB
        Benoy Antony
      2. HADOOP-10221.patch
        18 kB
        Benoy Antony
      3. HADOOP-10221.patch
        19 kB
        Benoy Antony
      4. HADOOP-10221.patch
        19 kB
        Benoy Antony
      5. HADOOP-10221.patch
        19 kB
        Benoy Antony
      6. HADOOP-10221.no-static.example
        11 kB
        Daryn Sharp
      7. HADOOP-10221.patch
        17 kB
        Benoy Antony
      8. HADOOP-10221.patch
        18 kB
        Benoy Antony
      9. HADOOP-10221.patch
        18 kB
        Benoy Antony
      10. HADOOP-10221.patch
        16 kB
        Benoy Antony
      11. HADOOP-10221.patch
        16 kB
        Benoy Antony
      12. HADOOP-10221.patch
        15 kB
        Benoy Antony

        Issue Links

          Activity

          benoyantony Benoy Antony added a comment -

          The solution could be to allow the SaslProperties (which include the QOP) to be decided for each connection based on custom logic that can be plugged in.
          SaslRpcServer obtains the SaslProperties from a configured SaslPropertiesResolver if one exists. If not, it falls back to the default SaslProperties loaded during initialization.

          The feature is backward compatible.
          Additional tests are enabled in TestSaslRPC

          benoyantony Benoy Antony added a comment -

          Adding comments to SaslPropertiesResolver

          daryn Daryn Sharp added a comment -

          Like the idea. It's something I've been meaning to do as well, because the current QOP handling isn't practical: it's all or nothing. Will try to review today/tomorrow.

          sureshms Suresh Srinivas added a comment -

          Daryn Sharp, would you be able to review this? It would be great to get this done quickly and make it available in 2.4 release.

          daryn Daryn Sharp added a comment -

          Yes, looking at it.

          daryn Daryn Sharp added a comment -

          Still pondering, but Connection#attemptingUser is not defined until after SASL negotiation is in progress. I'm not sure why it's being passed to the resolver that will be used to create the SASL object; won't it always be null?

          daryn Daryn Sharp added a comment -

          I think this makes subtle thread-safety issues a bit too easy for the unsuspecting implementor, in the sense that it continues to rely on the global static SaslRpcServer.SASL_PROPS. If a resolver twiddles any of the values, it may unduly influence other connections.

          daryn Daryn Sharp added a comment -

          Bad memories are flooding back. The static nature of the QOPs is a big pre-existing problem. If we are going to expose a publicly supported interface, we need to avoid locking in the current bad behavior. Fixing it later will likely cause incompatibilities.

          The static nature of the QOP causes servers to step all over each other's required settings. One RPC server can accidentally upgrade or downgrade the protections that another service intended. One resolver might be surprised that another resolver squished the default values, etc. A daemon with RPC servers may be shocked to find that an RPC client re-initialized the static QOP too, which may effectively disable the QOP settings the server(s) expected!
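
          As a contrived sketch of that hazard, a plain static map standing in for the old SaslRpcServer.SASL_PROPS shows how one writer silently overrides another (names here are hypothetical):

          import java.util.HashMap;
          import java.util.Map;
          import javax.security.sasl.Sasl;

          public class StaticQopStomping {
            // Stand-in for the shared static SASL properties.
            static final Map<String, String> STATIC_SASL_PROPS = new HashMap<String, String>();

            public static void main(String[] args) {
              STATIC_SASL_PROPS.put(Sasl.QOP, "auth-conf");  // server A requires privacy
              STATIC_SASL_PROPS.put(Sasl.QOP, "auth");       // server B (or a client re-init)
                                                             // silently downgrades it
              System.out.println(STATIC_SASL_PROPS.get(Sasl.QOP));  // prints "auth"
            }
          }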

          benoyantony Benoy Antony added a comment -

          Thanks Daryn Sharp for the review.

          SaslPropertiesResolver.java:

          Map<String, String> resolve(Map<String, String> properties,
                                      ConnectionContext ctxt);

          1. To make sure that the input is not modified, I can pass an unmodifiable map of properties.
          2. ConnectionContext with the UGI and IP address was introduced because of an internal review. I was using only the IP address to make the decision. I can remove the UGI field from ConnectionContext.
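
          A minimal sketch of point 1: wrapping the defaults in a read-only view means a careless resolver implementation cannot mutate them (the property value below is just an example):

          import java.util.Collections;
          import java.util.HashMap;
          import java.util.Map;
          import javax.security.sasl.Sasl;

          public class UnmodifiablePropsExample {
            public static void main(String[] args) {
              Map<String, String> defaults = new HashMap<String, String>();
              defaults.put(Sasl.QOP, "auth,auth-int,auth-conf");

              Map<String, String> readOnly = Collections.unmodifiableMap(defaults);
              // readOnly.put(Sasl.QOP, "auth");  // would throw UnsupportedOperationException
              System.out.println(readOnly.get(Sasl.QOP));
            }
          }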

          benoyantony Benoy Antony added a comment -

          New patch which addresses the points noted above.

          chrilisf Chris Li added a comment -

          Hi Benoy Antony

          I'm trying to understand the issues raised. A couple questions:

          1. Is there anything to gain for a user with malicious intent to modify the global static SaslRpcServer.SASL_PROPS?
          2. Is it worth being paranoid and copying the map before passing it to an implementer of SaslPropertiesResolver, since any non-malicious user will do it anyway?

          benoyantony Benoy Antony added a comment -

          Chris, I believe the concern is more with the accidental modification of the properties by the implementer of SaslPropertiesResolver.

          benoyantony Benoy Antony added a comment -

          Attaching a new patch with a single change:

          Add a debug log in org.apache.hadoop.ipc.Client to log the negotiated qop.

          arpitagarwal Arpit Agarwal added a comment -

          My comments from reviewing the patch.

          1. The same CONSTRUCTOR_CACHE is used for both config and parameter-less constructor. Won’t you get a RuntimeException if both methods are used for the same class? Use separate cache?
          2. Javadoc for constructInstance is confusing. Reword as ‘Create an object passing the configuration to the constructor itself’?
          3. Add a comment to the description of hadoop.rpc.protection stating that hadoop.security.saslproperties.resolver.class can override it?
          4. ConnectionContext constructor has unnecessary super()?
          5. Nice use of parametrization to add a test case. AuthSaslPropertiesResolver appears unused outside of tests, can we make it a static nested class of TestSaslRPC?
          6. Nitpick: Extra space in resolve ( in SaslPropertiesResolver.java.
          7. Same in SaslRpcServer.java: resolver = SaslUtil.getResolver (conf);.
          8. Extra space in SaslUtil.java: public static Map. Also static boolean shouldEncrypt. Please follow the coding convention.
          9. Nitpick: Redundant cast at SaslUtil.java:49?

          Also the patch needs to be rebased (conflict in CommonConfigurationKeysPublic).

          arpitagarwal Arpit Agarwal added a comment -

          Also thanks for already addressing earlier comments from others!

          hadoopqa Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12632973/HADOOP-10221.patch
          against trunk revision .

          -1 patch. The patch command could not apply the patch.

          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3632//console

          This message is automatically generated.

          benoyantony Benoy Antony added a comment -

          Thanks for the detailed review and comments, Arpit Agarwal. Attaching a new patch which takes care of the comments.

          arpitagarwal Arpit Agarwal added a comment -

          Thank you for addressing the feedback Benoy Antony.

          +1 for the latest patch pending Jenkins.

          daryn Daryn Sharp added a comment -

          Please let me review the latest patch.

          arpitagarwal Arpit Agarwal added a comment -

          I will hold off on committing it.

          daryn Daryn Sharp added a comment -

          It's still very static in nature, and the shouldEncrypt method is assuming QOP is a single value whereas QOP now supports multiple values. I'll followup again shortly.
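
          For illustration, a multi-value-aware check has to scan the comma-separated QOP list instead of comparing a single value. This is only a hedged sketch, not the shouldEncrypt code from the patch:

          public class QopCheckSketch {
            // Returns true if "auth-conf" (privacy) appears anywhere in a
            // comma-separated QOP preference list such as "auth-conf,auth-int,auth".
            static boolean mayEncrypt(String qop) {
              if (qop == null) {
                return false;
              }
              for (String q : qop.split(",")) {
                if ("auth-conf".equals(q.trim())) {
                  return true;
                }
              }
              return false;
            }

            public static void main(String[] args) {
              System.out.println(mayEncrypt("auth-conf,auth-int,auth"));  // true
              System.out.println(mayEncrypt("auth"));                     // false
            }
          }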

          benoyantony Benoy Antony added a comment -

          Daryn Sharp, the shouldEncrypt method is not used in the current patch. It was added to decide whether or not to encrypt the HDFS data transfer. I can remove it from this patch if it is causing confusion.

          benoyantony Benoy Antony added a comment -

          Note that HDFS data transfer supports {encrypted, plain}.
          benoyantony Benoy Antony added a comment -

          Attaching the patch which removes the unused method - shouldEncrypt

          hadoopqa Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633210/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          -1 javadoc. The javadoc tool appears to have generated 3 warning messages.
          See https://builds.apache.org/job/PreCommit-HADOOP-Build/3637//artifact/trunk/patchprocess/diffJavadocWarnings.txt for details.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3637//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3637//console

          This message is automatically generated.

          hadoopqa Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633225/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          -1 javadoc. The javadoc tool appears to have generated 2 warning messages.
          See https://builds.apache.org/job/PreCommit-HADOOP-Build/3638//artifact/trunk/patchprocess/diffJavadocWarnings.txt for details.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3638//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3638//console

          This message is automatically generated.

          daryn Daryn Sharp added a comment -

          Sorry this jira's review has been delayed, I was dealing with internal issues.

          Here's an example patch of what I meant in my original review about removing the static nature of the QOP. Currently it's a global configuration whereby multiple RPC instances can stomp on each other's configuration.

          This is a completely uncompiled & untested example patch. Hopefully we can kill two birds with one stone: sasl property configuration is pluggable, and rpc servers may each manage their own sasl properties. Oh, and I added your resolver concept to the rpc client too.

          benoyantony Benoy Antony added a comment -

          Thanks for the code, Daryn Sharp.
          To make sure that I understood the logic correctly, the patch does the following:

          • SaslPropertiesResolver is used to return the SaslProperties.
          • The default implementation returns the list of QOPs read from hadoop.rpc.protection (a rough sketch of that mapping is shown below).
          • If needed, one can override it by providing another implementation.
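
          A rough sketch of that default mapping, translating the hadoop.rpc.protection levels into the javax.security.sasl QOP tokens; this is illustrative and not the exact code of the committed patch:

          import java.util.HashMap;
          import java.util.Map;
          import javax.security.sasl.Sasl;

          public class DefaultQopMapping {
            static Map<String, String> toSaslProperties(String rpcProtection) {
              StringBuilder qops = new StringBuilder();
              for (String level : rpcProtection.split(",")) {
                if (qops.length() > 0) {
                  qops.append(',');
                }
                String trimmed = level.trim();
                if ("authentication".equals(trimmed)) {
                  qops.append("auth");
                } else if ("integrity".equals(trimmed)) {
                  qops.append("auth-int");
                } else if ("privacy".equals(trimmed)) {
                  qops.append("auth-conf");
                } else {
                  throw new IllegalArgumentException("Unknown protection level: " + trimmed);
                }
              }
              Map<String, String> props = new HashMap<String, String>();
              props.put(Sasl.QOP, qops.toString());
              props.put(Sasl.SERVER_AUTH, "true");  // server must authenticate to the client
              return props;
            }

            public static void main(String[] args) {
              System.out.println(toSaslProperties("privacy,integrity"));
            }
          }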
          benoyantony Benoy Antony added a comment -

          I like this approach and am incorporating it. Will post a patch after testing it.

          hadoopqa Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633245/HADOOP-10221.no-static.example
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          -1 tests included. The patch doesn't appear to include any new or modified tests.
          Please justify why no new tests are needed for this patch.
          Also please list what manual steps were performed to verify this patch.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          -1 javadoc. The javadoc tool appears to have generated 2 warning messages.
          See https://builds.apache.org/job/PreCommit-HADOOP-Build/3639//artifact/trunk/patchprocess/diffJavadocWarnings.txt for details.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          -1 core tests. The patch failed these unit tests in hadoop-common-project/hadoop-common:

          org.apache.hadoop.ipc.TestRPC
          org.apache.hadoop.ipc.TestMiniRPCBenchmark
          org.apache.hadoop.security.TestDoAsEffectiveUser
          org.apache.hadoop.ipc.TestSaslRPC

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3639//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3639//console

          This message is automatically generated.

          benoyantony Benoy Antony added a comment -

          Attaching the patch based on Daryn Sharp's patch.
          Tested with and without a custom implementation on a test cluster.
          Ran javadoc and fixed javadoc warnings.

          benoyantony Benoy Antony added a comment -

          Attaching the patch which fixes whitespace and a hardcoded property name in TestSaslRPC.

          hadoopqa Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633454/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. There were no new javadoc warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3642//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3642//console

          This message is automatically generated.

          hadoopqa Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633456/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. There were no new javadoc warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3643//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3643//console

          This message is automatically generated.

          daryn Daryn Sharp added a comment -

          Minor nit in the tests, saslPopertiesResolver is missing an "r" in Properties.

          Would you please elaborate on the difference between the existing ReflectionUtils.newInstance and the new ReflectionUtils.constructInstance? The latter appears not to require the resolver class to extend Configured. The method mentions "... enables one to achieve achieve thread safety using final fields" so I must be missing something subtle. Also note the double "achieve achieve".

          benoyantony Benoy Antony added a comment -

          Attaching the patch which fixes the problems reported by Daryn Sharp.

          ReflectionUtils.newInstance calls the constructor first and then calls setConf on the constructed object. I can initialize a field based on the configuration in setConf, but I cannot make it a final field, since a final field can be initialized only inside a constructor. So I cannot take advantage of the thread safety offered by a final field when using newInstance.

          That's why I added constructInstance which accepts the configuration object and passes it to the constructor of the object being created.
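
          As a sketch of the two construction styles being compared (class names are hypothetical): with newInstance plus setConf the field cannot be final, while constructor injection allows a final field whose value is safely published to all threads once construction completes.

          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.conf.Configured;

          // Style used by ReflectionUtils.newInstance: no-arg constructor, then setConf.
          class SetConfStyleResolver extends Configured {
            private String[] qops;  // cannot be final; assigned after construction

            @Override
            public void setConf(Configuration conf) {
              super.setConf(conf);
              if (conf != null) {
                qops = conf.getStrings("hadoop.rpc.protection", "authentication");
              }
            }
          }

          // Style enabled by passing the Configuration to the constructor.
          class ConstructorStyleResolver {
            private final String[] qops;  // final: immutable and safely published

            ConstructorStyleResolver(Configuration conf) {
              this.qops = conf.getStrings("hadoop.rpc.protection", "authentication");
            }
          }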

          hadoopqa Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12633758/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. There were no new javadoc warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3652//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3652//console

          This message is automatically generated.

          daryn Daryn Sharp added a comment -

          I'd be inclined to think that the current reflection util is "good enough" for the rest of hadoop, so is it maybe a bit of overkill here? Allowing a separate setConf may offer the future possibility of refreshing the config w/o bouncing the server.

          benoyantony Benoy Antony added a comment -

          Thanks for the comment and explanation, Daryn Sharp.

          Attaching the new patch.
          This patch removes the changes in ReflectionUtils.java.
          The SaslPropertiesResolver now gets constructed via ReflectionUtils.newInstance(conf).

          hadoopqa Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12635349/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. There were no new javadoc warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3677//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3677//console

          This message is automatically generated.

          benoyantony Benoy Antony added a comment -

          Arpit Agarwal, Daryn Sharp, thanks for helping me with this jira.
          I have tested the patch on trunk in my test cluster and it works fine. Please let me know if anything else needs to be done for this.

          arpitagarwal Arpit Agarwal added a comment -

          Benoy, I can take a look at your latest patch.

          However since Daryn has been providing feedback I will hold off committing for now in case he has further comments.

          benoyantony Benoy Antony added a comment -

          Removed unnecessary statements from the test class. No major changes.

          daryn Daryn Sharp added a comment -

          +1. Since I contributed a chunk of the work, someone else like Arpit should be the determining factor.

          arpitagarwal Arpit Agarwal added a comment -

          Thanks for the confirmation, Daryn. I am reviewing the latest patch.

          hadoopqa Hadoop QA added a comment -

          +1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12635613/HADOOP-10221.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          +1 tests included. The patch appears to include 1 new or modified test files.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. There were no new javadoc warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/3683//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/3683//console

          This message is automatically generated.

          arpitagarwal Arpit Agarwal added a comment - edited

          +1 for the latest patch. I will commit it shortly.

          I took the liberty of making the following trivial edit to avoid spinning up another patch:

          -      return new SaslRpcServer(authMethod).create(this,saslProps,secretManager);
          +      return new SaslRpcServer(authMethod).create(this, saslProps, secretManager);
          
          arpitagarwal Arpit Agarwal added a comment -

          Thanks for the contribution Benoy Antony and Daryn Sharp. I committed this to trunk, branch-2 and branch-2.4.

          arpitagarwal Arpit Agarwal added a comment -

          Does this change have any dependencies other than HADOOP-10211? I see TestSaslRPC passes in trunk and branch-2 but fails in branch-2.4:

          testPerConnectionConf[1](org.apache.hadoop.ipc.TestSaslRPC)  Time elapsed: 0.03 sec  <<< ERROR!
          java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException:
          DIGEST-MD5: No common protection layer between client and server;
          Host Details : local host is: "Arpit-MB-Pro.local/192.168.0.4"; destination host is: "Arpit-MB-Pro.local":49730;
            at com.sun.security.sasl.digest.DigestMD5Client.checkQopSupport(DigestMD5Client.java:396)
            at com.sun.security.sasl.digest.DigestMD5Client.evaluateChallenge(DigestMD5Client.java:208)
            at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:407)
          ...

          Benoy Antony can you take a look? FWIW I am testing on OS X 10.9.2.

          hudson Hudson added a comment -

          SUCCESS: Integrated in Hadoop-trunk-Commit #5359 (See https://builds.apache.org/job/Hadoop-trunk-Commit/5359/)
          HADOOP-10221. Add file missed in previous checkin, fix typo. (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579387)

          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslPropertiesResolver.java
            HADOOP-10221. Add a plugin to specify SaslProperties for RPC protocol based on connection properties. (Contributed by Benoy Antony and Daryn Sharp) (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579382)
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcServer.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestSaslRPC.java
          benoyantony Benoy Antony added a comment -

          Thanks Arpit Agarwal. I am taking a look at the 2.4 issue.

          benoyantony Benoy Antony added a comment -

          Arpit Agarwal, HADOOP-10070 seems to be required. I merged HADOOP-10070 from trunk to 2.4 locally and TestSaslRPC passed.
          A direct merge of HADOOP-10070 from trunk to 2.4 failed on one file, but it was easy to fix it manually from the .rej file. If needed, I can attach the patch required to merge HADOOP-10070 to 2.4.

          arpitagarwal Arpit Agarwal added a comment -

          Thanks for the quick help with the diagnosis Benoy.

          I merged HADOOP-10070 to branch-2.4 and verified it fixes the test failure. The sole merge conflict was in CHANGES.txt which was expected.

          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-Yarn-trunk #515 (See https://builds.apache.org/job/Hadoop-Yarn-trunk/515/)
          HADOOP-10221. Add file missed in previous checkin, fix typo. (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579387)

          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslPropertiesResolver.java
            HADOOP-10221. Add a plugin to specify SaslProperties for RPC protocol based on connection properties. (Contributed by Benoy Antony and Daryn Sharp) (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579382)
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcServer.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestSaslRPC.java
          hudson Hudson added a comment -

          FAILURE: Integrated in Hadoop-Hdfs-trunk #1707 (See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1707/)
          HADOOP-10221. Add file missed in previous checkin, fix typo. (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579387)

          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslPropertiesResolver.java
            HADOOP-10221. Add a plugin to specify SaslProperties for RPC protocol based on connection properties. (Contributed by Benoy Antony and Daryn Sharp) (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579382)
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcServer.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestSaslRPC.java
          hudson Hudson added a comment -

          SUCCESS: Integrated in Hadoop-Mapreduce-trunk #1732 (See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/1732/)
          HADOOP-10221. Add file missed in previous checkin, fix typo. (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579387)

          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslPropertiesResolver.java
            HADOOP-10221. Add a plugin to specify SaslProperties for RPC protocol based on connection properties. (Contributed by Benoy Antony and Daryn Sharp) (arp: http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1579382)
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/CommonConfigurationKeysPublic.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Client.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/ipc/Server.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcClient.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/SaslRpcServer.java
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/resources/core-default.xml
          • /hadoop/common/trunk/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/ipc/TestSaslRPC.java
          jdere Jason Dere added a comment -

          I am looking at replacing the usage of SaslRpcServer.SASL_PROPS in the Hive project, but it appears that SaslPropertiesResolver.getDefaultProperties() has protected scope. Should this method be made public?
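          As a rough interim sketch (not part of this patch, and assuming getDefaultProperties() is non-final and that SaslPropertiesResolver is a concrete Configurable class, as on trunk at this point), code outside org.apache.hadoop.security could widen the protected method by overriding it in a trivial subclass. The package, class, and helper names below are hypothetical:

          package org.example.hive.security;  // hypothetical package

          import java.util.Map;

          import org.apache.hadoop.conf.Configuration;
          import org.apache.hadoop.security.SaslPropertiesResolver;

          /**
           * Sketch of a possible workaround until getDefaultProperties() is public:
           * Java lets an override widen (but never narrow) access, so a trivial
           * subclass can re-expose the protected method as public.
           */
          public class PublicSaslPropertiesResolver extends SaslPropertiesResolver {
            @Override
            public Map<String, String> getDefaultProperties() {
              return super.getDefaultProperties();
            }

            // Hypothetical convenience helper: build a resolver from a Configuration.
            public static Map<String, String> defaultSaslProps(Configuration conf) {
              PublicSaslPropertiesResolver resolver = new PublicSaslPropertiesResolver();
              resolver.setConf(conf);  // setConf comes from the Configurable contract
              return resolver.getDefaultProperties();
            }
          }

          Once the method itself is made public, the subclass becomes unnecessary and callers can use SaslPropertiesResolver directly.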

          arpitagarwal Arpit Agarwal added a comment -

          Yes, I think that will be fine. Can you please file a Jira to do so? Feel free to assign it to me.

          jdere Jason Dere added a comment -

          Just created HADOOP-10547. I don't have permissions to assign Hadoop Jiras.


            People

            • Assignee: benoyantony Benoy Antony
            • Reporter: benoyantony Benoy Antony
            • Votes: 0
            • Watchers: 8
