Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Version: 0.9.3
- Component: None
Description
When reading a string-to-double map from the same file using the Compact protocol, Python gives the correct values:
...
u'roh': -12.012431158160835
...
but Haskell is totally off:
...
("roh",6.355136015066463e-157)
...
The funny thing is, if I read it into Haskell (where the numbers are all off) and then write it out to another file, that file still has correct numbers when loaded into Python. So the raw bytes seem to be (de)serialized correctly; Haskell just isn't interpreting them as a double the same way Python does...
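For what it's worth, the wrong value looks like the same 8 bytes read with the opposite byte order. A quick sketch in Python (my own check, not code from either library) reproduces the Haskell output by reinterpreting the double's big-endian bytes as little-endian:

```python
import struct

correct = -12.012431158160835  # the value Python reads from the file

# Pack the double big-endian, then unpack the same 8 bytes little-endian,
# i.e. simulate reading the on-wire bytes with the wrong byte order.
wrong = struct.unpack('<d', struct.pack('>d', correct))[0]

print(wrong)  # -> 6.355136015066463e-157, the value Haskell printed
```

That matches the broken output exactly, which points at an endianness mismatch in the Haskell Compact protocol's double handling rather than any corruption of the serialized data itself.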