Spark / SPARK-8309

OpenHashMap doesn't work with more than 12M items


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Critical
    • Resolution: Fixed
    • Affects Version/s: 1.3.1, 1.4.0
    • Fix Version/s: 1.3.2, 1.4.1, 1.5.0
    • Component/s: Spark Core
    • Labels: None

      Description

      The problem can be demonstrated with the following test case:

        import org.apache.spark.util.collection.OpenHashMap

        test("support for more than 12M items") {
          val cnt = 12000000 // 12M
          val map = new OpenHashMap[Int, Int](cnt)
          for (i <- 0 until cnt) {
            map(i) = 1
          }
          // Every key was assigned the value 1, so no entry should read back as 0.
          val numInvalidValues = map.iterator.count(_._2 == 0)
          assertResult(0)(numInvalidValues)
        }
      
      


            People

            • Assignee: Vyacheslav Baranov (slavik.baranov)
            • Reporter: Vyacheslav Baranov (wildfire)
            • Votes: 0
            • Watchers: 2
