- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.5.1, 1.6.0, 1.6.1, 2.0.0
- Component/s: Block Manager, Spark Core
- Labels: None
`spark.storage.memoryMapThreshold` has two kinds of default values: one is 2*1024*1024 as an integer, and the other is '2m' as a string.
The documentation recommends "2m", but that value causes a failure when the code path reaches TransportConf#memoryMapBytes.
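The mismatch can be sketched as follows: a suffix-aware byte-string parser (loosely modeled on Spark's `Utils.byteStringAsBytes`; this is an illustrative stand-in, not the actual Spark implementation) accepts "2m", while a plain integer parse, as a `conf.getInt`-style path in `TransportConf#memoryMapBytes` would perform, throws on the same value.

```java
public class MemoryMapThresholdDemo {

    // Illustrative suffix-aware parser: accepts values like "2m" or "64k"
    // and converts them to a byte count. Not the real Spark implementation.
    static long byteStringAsBytes(String s) {
        String v = s.trim().toLowerCase();
        if (v.endsWith("m")) {
            return Long.parseLong(v.substring(0, v.length() - 1)) * 1024 * 1024;
        } else if (v.endsWith("k")) {
            return Long.parseLong(v.substring(0, v.length() - 1)) * 1024;
        }
        return Long.parseLong(v);
    }

    public static void main(String[] args) {
        // The suffix-aware path handles the documented "2m" form:
        System.out.println(byteStringAsBytes("2m")); // 2097152

        // A plain integer parse of the same configured value fails:
        try {
            Integer.parseInt("2m");
            System.out.println("parsed");
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException");
        }
    }
}
```

Running the sketch shows why the two code paths disagree: the same configuration string is valid for one parser and a `NumberFormatException` for the other.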
Usage of `spark.storage.memoryMapThreshold`: