SPARK-18827: Can't read broadcast if broadcast blocks are stored on-disk


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.0.1, 2.0.2, 2.1.0
    • Fix Version/s: 2.0.3, 2.1.1
    • Component/s: Spark Core
    • Labels: None

    Description

      How to reproduce it:

        test("Cache broadcast to disk") {
          val conf = new SparkConf()
            .setAppName("Cache broadcast to disk")
            .setMaster("local")
            .set("spark.memory.useLegacyMode", "true")
            .set("spark.storage.memoryFraction", "0.0")
          sc = new SparkContext(conf)
          val list = List[Int](1, 2, 3, 4)
          val broadcast = sc.broadcast(list)
          assert(broadcast.value.sum === 10)
        }
      

      A NoSuchElementException has been thrown since SPARK-17503 when a broadcast cannot be cached in memory. The reason is that the change in SPARK-17503 does not cover the !unrolled.hasNext case in the next() function.
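
      For illustration, here is a minimal sketch (not the actual Spark source; the class and parameter names are assumptions) of the iterator shape involved: a wrapper chains the values already unrolled in memory with the rest of the input, and next() must fall back to the rest once the unrolled part is drained.

        // Simplified sketch, assuming a chained-iterator shape similar to the one
        // used when a block is only partially unrolled in memory.
        class ChainedIterator[T](unrolled: Iterator[T], rest: Iterator[T])
          extends Iterator[T] {

          override def hasNext: Boolean = unrolled.hasNext || rest.hasNext

          // Buggy shape: returning unrolled.next() unconditionally throws
          // NoSuchElementException once unrolled is exhausted. The fix is to
          // check !unrolled.hasNext and switch over to rest.
          override def next(): T =
            if (unrolled.hasNext) unrolled.next() else rest.next()
        }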

      Attachments

        1. NoSuchElementException4722.gif
          118 kB
          Yuming Wang


            People

              Assignee: Yuming Wang (yumwang)
              Reporter: Yuming Wang (yumwang)
              Votes: 0
              Watchers: 3
