Hadoop Common - HADOOP-8476

Remove duplicate VM arguments for hadoop daemon

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Duplicate
    • Affects Version/s: 2.0.0-alpha, 3.0.0
    • Fix Version/s: None
    • Component/s: scripts
    • Labels: None
    • Target Version/s:

      Description

      Remove the duplicate VM arguments passed to the Hadoop daemon.

      The following VM arguments are currently duplicated:

      -Dproc_namenode
      -Xmx1000m
      -Djava.net.preferIPv4Stack=true
      -Xmx128m
      -Xmx128m
      -Dhadoop.log.dir=/home/nn2/logs
      -Dhadoop.log.file=hadoop-root-namenode-HOST-xx-xx-xx-105.log
      -Dhadoop.home.dir=/home/nn2/
      -Dhadoop.id.str=root
      -Dhadoop.root.logger=INFO,RFA
      -Dhadoop.policy.file=hadoop-policy.xml
      -Djava.net.preferIPv4Stack=true
      -Dhadoop.security.logger=INFO,RFAS
      -Dhdfs.audit.logger=INFO,NullAppender
      -Dhadoop.security.logger=INFO,RFAS
      -Dhdfs.audit.logger=INFO,NullAppender
      -Dhadoop.security.logger=INFO,RFAS
      -Dhdfs.audit.logger=INFO,NullAppender
      -Dhadoop.security.logger=INFO,RFAS

      In the above VM arguments, -Xmx1000m will be overridden by -Xmx128m.
      The other duplicate arguments are harmless.
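
      The duplication happens because the shared shell scripts append to HADOOP_OPTS each time they are sourced, and the JVM simply takes the last -Xmx it sees. A minimal sketch of the effect, with simplified paths and options (not the exact contents of hadoop-config.sh / hadoop-env.sh):

      # hadoop-env.sh style: append daemon options to whatever is already set
      export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

      # hadoop-config.sh gets sourced again by hdfs-config.sh / yarn-config.sh,
      # so the same append runs a second time:
      export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

      echo "$HADOOP_OPTS"
      # -> -Djava.net.preferIPv4Stack=true -Djava.net.preferIPv4Stack=true

      # For -Xmx the JVM honours the last occurrence on the command line, so
      #   java -Xmx1000m ... -Xmx128m ...
      # starts the NameNode with a 128 MB heap, not 1000 MB.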

      Attachments

      1. HADOOP-8476.patch
        10 kB
        Vinayakumar B
      2. HADOOP-8476.patch
        5 kB
        Vinayakumar B

        Issue Links

          Activity

          Allen Wittenauer made changes -
          Component/s scripts [ 12311393 ]
          Component/s conf [ 12310711 ]
          Allen Wittenauer made changes -
          Link This issue Is contained by HADOOP-9902 [ HADOOP-9902 ]
          Vinayakumar B made changes -
          Status Open [ 1 ] Resolved [ 5 ]
          Resolution Duplicate [ 3 ]
          Vinayakumar B added a comment -

          These are already handled as part of the script rewrite.

          nijel added a comment -

          I also ran into problems with the duplicate -Xmx values!

          Vinay, can you update the patch?

          Vinayakumar B made changes -
          Status Patch Available [ 10002 ] Open [ 1 ]
          Arpit Gupta added a comment -

          Vinay, could you regenerate the patch? It does not apply on trunk.

          Also, some comments/questions based on your patch file:

          You have added an option to the hadoop-config.sh script to skip the Hadoop opts ("--skip_hadoop_opts"), and you are passing it in all the various places hadoop-config.sh is called, which skips the setting of HADOOP_OPTS.

          I don't think we should assume that people will have the appropriate values set in the environment by the hadoop-env.sh config file. People change this config based on their needs, and we cannot force them to have all of these defined. hadoop-config.sh made sure certain defaults were set.
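
          For reference, a guard like the one described above might look roughly as follows inside hadoop-config.sh. This is a hypothetical sketch of the approach, not the actual patch; only the "--skip_hadoop_opts" flag name comes from the comment above, the surrounding logic and the appended options are assumed:

          # hypothetical sketch: honour a --skip_hadoop_opts flag before applying defaults
          skip_hadoop_opts=false
          if [ "$1" = "--skip_hadoop_opts" ]; then
            skip_hadoop_opts=true
            shift
          fi

          if [ "$skip_hadoop_opts" = "false" ]; then
            # only append the default options when not skipping
            HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.log.dir=$HADOOP_LOG_DIR"
            HADOOP_OPTS="$HADOOP_OPTS -Dhadoop.root.logger=${HADOOP_ROOT_LOGGER:-INFO,console}"
          fi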

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12541972/HADOOP-8476.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          -1 tests included. The patch doesn't appear to include any new or modified tests.
          Please justify why no new tests are needed for this patch.
          Also please list what manual steps were performed to verify this patch.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          +1 core tests. The patch passed unit tests in hadoop-common-project/hadoop-common.

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/1345//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/1345//console

          This message is automatically generated.

          Vinayakumar B made changes -
          Attachment HADOOP-8476.patch [ 12541972 ]
          Vinayakumar B added a comment -

          Attaching the correct patch to remove duplicate VM args

          Hadoop QA added a comment -

          -1 overall. Here are the results of testing the latest attachment
          http://issues.apache.org/jira/secure/attachment/12530809/HADOOP-8476.patch
          against trunk revision .

          +1 @author. The patch does not contain any @author tags.

          -1 tests included. The patch doesn't appear to include any new or modified tests.
          Please justify why no new tests are needed for this patch.
          Also please list what manual steps were performed to verify this patch.

          +1 javac. The applied patch does not increase the total number of javac compiler warnings.

          +1 javadoc. The javadoc tool did not generate any warning messages.

          +1 eclipse:eclipse. The patch built with eclipse:eclipse.

          +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

          +1 release audit. The applied patch does not increase the total number of release audit warnings.

          -1 core tests. The patch failed these unit tests in hadoop-common-project/hadoop-common:

          org.apache.hadoop.fs.viewfs.TestViewFsTrash

          +1 contrib tests. The patch passed contrib unit tests.

          Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/1119//testReport/
          Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/1119//console

          This message is automatically generated.

          Vinayakumar B made changes -
          Status Open [ 1 ] Patch Available [ 10002 ]
          Vinayakumar B made changes -
          Attachment HADOOP-8476.patch [ 12530809 ]
          Vinayakumar B added a comment -

          Attaching the patch, which includes the above-mentioned changes, along with changes to slaves.sh and the addition of HADOOP_ZKFC_OPTS from templates/conf/hadoop-env.sh.

          Vinayakumar B added a comment -

          The following changes will remove the duplicate VM args.

          1. Make the following changes in hadoop-env.sh:

          export HADOOP_OPTS=""
          
          export HADOOP_NAMENODE_OPTS="-Dhdfs.audit.logger=${HDFS_AUDIT_LOGGER:-INFO,NullAppender}"
          export HADOOP_DATANODE_OPTS=""
          export HADOOP_SECONDARYNAMENODE_OPTS="-Dhdfs.audit.logger=${HDFS_AUDIT_LOGGER:-INFO,NullAppender}"
          
          export HADOOP_CLIENT_OPTS="-Xmx128m"
          

          The above change is made because, from any script, hadoop-config.sh will be the first script called, which in turn calls hadoop-env.sh, so appending is not required. The remaining args will be added inside hadoop-config.sh anyway.

          Also, hadoop-config.sh will be called again by hdfs-config.sh and yarn-config.sh, so the args get appended again and again.

          2. Remove the explicit call to hadoop-env.sh from hadoop-daemon.sh.

          I will post a patch shortly.

          Any comments on this?
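
          As an illustration of the repeated-sourcing problem described above, one alternative way to make the setup idempotent (a hypothetical sketch, not what the attached patch or the proposal above does) is to let hadoop-config.sh source hadoop-env.sh and append its defaults only once per shell:

          # hypothetical guard at the top of hadoop-config.sh:
          # run the environment setup only the first time this script is sourced
          if [ -z "$HADOOP_CONFIG_SOURCED" ]; then
            HADOOP_CONFIG_SOURCED=true

            # source the user-editable environment file once
            if [ -f "${HADOOP_CONF_DIR}/hadoop-env.sh" ]; then
              . "${HADOOP_CONF_DIR}/hadoop-env.sh"
            fi

            # append the script defaults once as well
            HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"
          fi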

          Vinayakumar B made changes -
          Description [edited]
          Vinayakumar B made changes -
          Description [edited]
          Vinayakumar B created issue -

            People

            • Assignee: Vinayakumar B
            • Reporter: Vinayakumar B
            • Votes: 0
            • Watchers: 6
