Hadoop Common / HADOOP-2081

Configuration getInt, getLong, and getFloat replace invalid numbers with the default value

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 0.22.0
    • Fix Version/s: 0.23.0
    • Component/s: conf
    • Labels:
      None
    • Hadoop Flags:
      Incompatible change, Reviewed
    • Release Note:
      Invalid configuration values now result in a number format exception rather than the default value being used.

      Description

      The current behavior of silently replacing invalid numbers in the configuration with the default value leads to hard-to-diagnose problems. The methods should instead throw an exception to signal that the input was invalid.
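
      As a minimal sketch of the behavioral difference (the key name and values here are illustrative, not taken from this issue):

          import org.apache.hadoop.conf.Configuration;

          public class InvalidNumberExample {
            public static void main(String[] args) {
              Configuration conf = new Configuration();
              // "64K" is not a parseable int; the key name is hypothetical.
              conf.set("example.buffer.size", "64K");

              // Old behavior: the invalid value is silently ignored and 4096 is returned.
              // Proposed behavior: java.lang.NumberFormatException is thrown instead.
              System.out.println(conf.getInt("example.buffer.size", 4096));
            }
          }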

      1. HADOOP-2081.r1.diff
        5 kB
        Harsh J
      2. HADOOP-2081.r2.diff
        7 kB
        Harsh J
      3. HADOOP-2081.r3.diff
        7 kB
        Harsh J

        Activity

        Harsh J added a comment -

        Patch that lets Configuration throw NumberFormatExceptions in cases of get{Int,Long,Float}.

        Boolean makes sense to remain as-is, with defaults coming up instead of a generic 'false' if one were to use Boolean.parseBoolean(…).
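
        Roughly, the intended semantics for the numeric getters look like this sketch (not the exact patch; other details of the real Configuration methods are omitted):

          // Sketch only: return the default when the key is absent, and let a
          // NumberFormatException propagate when the value is present but invalid.
          public int getInt(String name, int defaultValue) {
            String valueString = get(name);
            if (valueString == null) {
              return defaultValue;                 // unset key: default still honored
            }
            return Integer.parseInt(valueString);  // invalid value: NumberFormatException
          }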

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12486745/HADOOP-2081.r1.diff
        against trunk revision 1147317.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/736//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/736//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/736//console

        This message is automatically generated.

        Harsh J added a comment -

        Iterative patch:

        + Javadoc changes to the API

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12486746/HADOOP-2081.r2.diff
        against trunk revision 1147317.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/737//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/737//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/737//console

        This message is automatically generated.

        Uma Maheswara Rao G added a comment -

        Hi Harsh,

        What about the API signatures? It looks like they will still expect a default value, which will now be unused in the definition.
        If we change the public API signature, that would be an incompatible change.
        So, to avoid this, can we provide overloaded APIs for this change that do not take a default value?

        Harsh J added a comment -

        Uma,

        Passed defaults will still be returned if there is no value in the configuration, just not when there's a NumberFormatException (in which case it simply gets thrown, but doesn't require a 'throws' statement, which would have been a compatibility concern).

        I have not changed the method signatures, but this is indeed an incompatible change (I have marked it as such) given that it changes something that was previously expected of these methods.

        What form of overloaded API signature are you suggesting, however?
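
        For illustration, the two cases read as follows (the key names are made up for this example):

          import org.apache.hadoop.conf.Configuration;

          public class DefaultVsInvalid {
            public static void main(String[] args) {
              Configuration conf = new Configuration();

              // Key not set at all: the passed default is still returned, as before.
              System.out.println(conf.getLong("example.unset.key", 42L)); // prints 42

              // Key set to a non-numeric value: NumberFormatException is thrown.
              conf.set("example.bad.key", "not-a-number");
              try {
                conf.getLong("example.bad.key", 42L);
              } catch (NumberFormatException expected) {
                // Unchecked, so no 'throws' clause or signature change is required.
                System.out.println("invalid value rejected: " + expected.getMessage());
              }
            }
          }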

        Uma Maheswara Rao G added a comment -

        Passed defaults will still be returned if there is no value in the configuration.

        Got it. I looked into your patch. This is fine; if there is no value in the config, you retained the same behaviour.
        The changes look fine to me.

        Uma Maheswara Rao G added a comment -
        +   * If no such property exists, or if the specified value is not a valid
        +   * <code>float</code>, then an error is thrown.
        

        I think you need to update this javadoc, right? As written, it says an error is thrown even if no such property exists.
        The same applies to the other APIs.

        Harsh J added a comment -

        Uma,

        Thanks for catching that bad javadoc. I think I see now how the confusion arose, apologies!

        + Fixed javadocs to reflect the default value action. Thanks Uma!
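
        The corrected wording could read along these lines (a sketch, not the exact committed javadoc; the method body is abridged):

          /**
           * Get the value of the <code>name</code> property as a <code>float</code>.
           * If no such property exists, the provided default value is returned;
           * if the specified value is not a valid <code>float</code>, then an
           * error is thrown.
           */
          public float getFloat(String name, float defaultValue) {
            String valueString = get(name);
            return valueString == null ? defaultValue : Float.parseFloat(valueString);
          }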

        Hadoop QA added a comment -

        +1 overall. Here are the results of testing the latest attachment
        http://issues.apache.org/jira/secure/attachment/12486773/HADOOP-2081.r3.diff
        against trunk revision 1147317.

        +1 @author. The patch does not contain any @author tags.

        +1 tests included. The patch appears to include 3 new or modified tests.

        +1 javadoc. The javadoc tool did not generate any warning messages.

        +1 javac. The applied patch does not increase the total number of javac compiler warnings.

        +1 findbugs. The patch does not introduce any new Findbugs (version 1.3.9) warnings.

        +1 release audit. The applied patch does not increase the total number of release audit warnings.

        +1 core tests. The patch passed core unit tests.

        +1 system test framework. The patch passed system test framework compile.

        Test results: https://builds.apache.org/job/PreCommit-HADOOP-Build/739//testReport/
        Findbugs warnings: https://builds.apache.org/job/PreCommit-HADOOP-Build/739//artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
        Console output: https://builds.apache.org/job/PreCommit-HADOOP-Build/739//console

        This message is automatically generated.

        Eli Collins added a comment -

        +1 latest patch looks good.

        Eli Collins added a comment -

        I've committed this. Thanks Harsh!

        Hudson added a comment -

        Integrated in Hadoop-Common-trunk-Commit #694 (See https://builds.apache.org/job/Hadoop-Common-trunk-Commit/694/)
        HADOOP-2081. Configuration getInt, getLong, and getFloat replace invalid numbers with the default value. Contributed by Harsh J

        eli : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1147971
        Files :

        • /hadoop/common/trunk/common/CHANGES.txt
        • /hadoop/common/trunk/common/src/test/core/org/apache/hadoop/conf/TestConfiguration.java
        • /hadoop/common/trunk/common/src/java/org/apache/hadoop/conf/Configuration.java
        Hudson added a comment -

        Integrated in Hadoop-Common-trunk #751 (See https://builds.apache.org/job/Hadoop-Common-trunk/751/)
        HADOOP-2081. Configuration getInt, getLong, and getFloat replace invalid numbers with the default value. Contributed by Harsh J

        eli : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1147971
        Files :

        • /hadoop/common/trunk/common/CHANGES.txt
        • /hadoop/common/trunk/common/src/test/core/org/apache/hadoop/conf/TestConfiguration.java
        • /hadoop/common/trunk/common/src/java/org/apache/hadoop/conf/Configuration.java
        shobin joseph added a comment -

        This fix will create problems when we run the Vaidya script in hadoop-2.0.0-mr1-cdh4.1.2.

        Exception: java.lang.NumberFormatException: For input string: "0.05"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
        at java.lang.Integer.parseInt(Integer.java:458)
        at java.lang.Integer.parseInt(Integer.java:499)
        at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:887)
        at org.apache.hadoop.vaidya.postexdiagnosis.tests.MapSideDiskSpill.getPrescription(MapSideDiskSpill.java:99)
        at org.apache.hadoop.vaidya.DiagnosticTest.getReportElement(DiagnosticTest.java:283)
        at org.apache.hadoop.vaidya.postexdiagnosis.PostExPerformanceDiagnoser.main(PostExPerformanceDiagnoser.java:254)

        In the job conf file, the properties look like:

        <property><name>mapred.reduce.slowstart.completed.maps</name><value>0.05</value><source>mapred-default.xml</source><source>/tmp/hadoop-527022/mapred/local/jobTracker/job_201302121643_0003.xml</source></property>

        <property><name>io.sort.record.percent</name><value>0.05</value><source>mapred-default.xml</source><source>/tmp/hadoop-527022/mapred/local/jobTracker/job_201302121643_0003.xml</source></property>

        <property><name>io.sort.spill.percent</name><value>0.80</value><source>mapred-default.xml</source><source>/tmp/hadoop-527022/mapred/local/jobTracker/job_201302121643_0003.xml</source></property>

        All of the above lines throw the exception.

        The important thing is that the default values for these properties are 0.05 and 0.8.

        shobin joseph added a comment -

        Harsh J, Eli Collins, Hadoop QA, Owen O'Malley: please go through my post above.

        Harsh J added a comment -

        Hi Shobin,

        I believe you should file a new bug against Vaidya. If it expects to read a float (mostly percentages) value as an int, it is doing things wrong: it will almost always read them as 0 and do its computation inaccurately. Please do file a new MR JIRA against the Vaidya component, preferably with a patch that changes its getInt call to something more appropriate/correct.

        This change is not the real bug you're facing; it has just brought it to the front.
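
        A hypothetical sketch of the kind of Vaidya fix being suggested (the property name is taken from the report above; the variable name and default are illustrative):

          // Read the percentage-style property as a float rather than an int, so a
          // value such as "0.05" parses correctly instead of throwing
          // NumberFormatException under the new getInt semantics.
          float slowstart =
              conf.getFloat("mapred.reduce.slowstart.completed.maps", 0.05f);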


          People

          • Assignee: Harsh J
          • Reporter: Owen O'Malley
          • Votes: 0
          • Watchers: 4
