Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 2.0.1, 2.0.2, 2.1.0
- Fix Version/s: None
Description
How to reproduce it:
test("Cache broadcast to disk") {
  val conf = new SparkConf()
    .setAppName("Cache broadcast to disk")
    .setMaster("local")
    .set("spark.memory.useLegacyMode", "true")
    .set("spark.storage.memoryFraction", "0.0")
  sc = new SparkContext(conf)
  val list = List[Int](1, 2, 3, 4)
  val broadcast = sc.broadcast(list)
  assert(broadcast.value.sum === 10)
}
Since SPARK-17503, a NoSuchElementException is thrown if a broadcast value cannot be cached in memory. The reason is that that change does not cover the !unrolled.hasNext case in the next() function.
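The uncovered branch can be illustrated with a small stand-alone sketch. The class and object names below are hypothetical and simplified, not Spark's actual storage-layer code: an iterator concatenates an in-memory ("unrolled") part with a remainder, but its next() assumes the unrolled part is non-empty.

```scala
// Minimal sketch of the failure mode (hypothetical names, not Spark's
// actual classes): an iterator that concatenates an in-memory
// ("unrolled") part with the remaining elements, whose next() only
// handles the case where the unrolled part still has elements.
class BuggyPartialIterator[T](unrolled: Iterator[T], rest: Iterator[T])
    extends Iterator[T] {
  override def hasNext: Boolean = unrolled.hasNext || rest.hasNext

  // Bug: the !unrolled.hasNext branch is not covered, so next() throws
  // NoSuchElementException even though `rest` still has elements.
  override def next(): T = unrolled.next()
}

object Demo {
  def main(args: Array[String]): Unit = {
    // With spark.storage.memoryFraction = 0.0 nothing fits in memory,
    // so the unrolled part is empty and every element sits in `rest`.
    val it = new BuggyPartialIterator(Iterator[Int](), Iterator(1, 2, 3, 4))
    try {
      it.next()
    } catch {
      case e: NoSuchElementException =>
        println(s"caught ${e.getClass.getSimpleName}, hasNext = ${it.hasNext}")
    }
  }
}
```

Even though hasNext reports true (the remainder holds all four elements), next() fails, which matches the symptom in the repro above. The fix is for next() to fall back to the remaining elements when the unrolled part is exhausted.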
Attachments
Issue Links
- is duplicated by
  - SPARK-21794: exception about reading task serial data(broadcast) value when the storage memory is not enough to unroll (Closed)
- links to