Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Fixed
Operating System: other
Platform: Other
Issue: 37300
Description
If you run the string "09" through the Java integer validator
(org.apache.commons.validator.GenericTypeValidator.formatInt), it validates
successfully as an integer (as the number 9, actually).
If you run the same string through the JavaScript integer validator
(org/apache/commons/validator/javascript/validateInteger.js), it signals an
error that the string is not a valid number.
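
For illustration, a minimal Java sketch of the server-side behavior described above (assuming commons-validator is on the classpath; the class name is just for the demo):

import org.apache.commons.validator.GenericTypeValidator;

public class LeadingZeroDemo {
    public static void main(String[] args) {
        // The Java validator parses "09" in radix 10 and returns the Integer 9.
        Integer result = GenericTypeValidator.formatInt("09");
        System.out.println(result); // prints 9 (null would have meant validation failed)

        // validateInteger.js, given the same "09", reports that it is not a valid number.
    }
}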
The problem is that JavaScript's parseInt function (used in the JavaScript
integer validation function) does radix snooping: it takes the (pretty standard)
"0x" prefix to indicate a hexadecimal value and a leading "0" to indicate an
octal value. The Java version of the validator, on the other hand, uses "new
Integer(String)" to create a new Integer object.
The JavaDoc for Integer.<init>(String) states:
"The string is converted to an int value in exactly the manner used by the
parseInt method for radix 10."
So, the Java version explicitly uses radix 10, while the JavaScript version will
auto-detect the radix.
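
A small JDK-only sketch of the behavior the JavaDoc describes (not validator code, just an illustration):

public class RadixTenDemo {
    public static void main(String[] args) {
        // new Integer(String) is specified to behave like parseInt(s, 10),
        // so the leading zero is simply ignored.
        System.out.println(new Integer("09"));          // 9
        System.out.println(Integer.parseInt("09", 10)); // 9
    }
}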
I'm not sure if I care which way this goes, but these two ought to be consistent.
To "fix" the Javascript version, one technique would be to trim-off leading
zeros (not sure what to do about "0xf00"). To "fix" the Java version, we should
use Integer.decode(String) instead of new Integer(String).
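
A sketch of what the Java-side change would mean in practice: with Integer.decode(String), the leading zero triggers octal parsing, so "09" fails just as it does in the JavaScript validator.

public class DecodeFixDemo {
    public static void main(String[] args) {
        try {
            // decode() does radix snooping much like JavaScript's parseInt:
            // "0x..." is hexadecimal, a leading "0" means octal, so "09" is invalid.
            System.out.println(Integer.decode("09"));
        } catch (NumberFormatException e) {
            System.out.println("09 is not a valid number: " + e.getMessage());
        }
        // For comparison, hexadecimal input would then be accepted:
        System.out.println(Integer.decode("0xf00")); // 3840
    }
}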