Spark / SPARK-21923

Avoid calling reserveUnrollMemoryForThisTask for every record


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version: 2.2.0
    • Fix Version: 2.3.0
    • Component: Spark Core
    • Labels: None

    Description

      When Spark persists data to off-heap (unsafe) memory, it calls `MemoryStore.putIteratorAsBytes`, which synchronizes on the `memoryManager` to reserve unroll memory for every record written. This per-record synchronization is unnecessary: we can reserve a larger amount of memory at a time and amortize the synchronization cost.
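      The idea can be sketched as follows. This is a minimal illustration with hypothetical names (`MemoryPool`, `ChunkedReserver`), not Spark's actual `MemoryManager` API: instead of taking the shared lock once per record, the writer reserves a larger chunk up front and draws from it locally, going back to the pool only when the chunk runs out.

      ```scala
      // Hypothetical shared pool: every reserve() takes the monitor lock,
      // standing in for the synchronized MemoryManager in Spark.
      class MemoryPool(var available: Long) {
        def reserve(bytes: Long): Boolean = synchronized {
          if (bytes <= available) { available -= bytes; true } else false
        }
      }

      // Per-task reserver: acquires from the pool in chunks so that most
      // per-record calls never touch the lock.
      class ChunkedReserver(pool: MemoryPool, chunkSize: Long) {
        private var unused: Long = 0L

        // Called once per record; hits the synchronized pool only when the
        // locally reserved chunk is exhausted.
        def acquire(bytes: Long): Boolean = {
          if (bytes > unused) {
            val request = math.max(bytes - unused, chunkSize)
            if (!pool.reserve(request)) return false
            unused += request
          }
          unused -= bytes
          true
        }
      }

      val pool = new MemoryPool(1024L * 1024)
      val reserver = new ChunkedReserver(pool, chunkSize = 64L * 1024)
      // 1000 records of 100 bytes: only two chunk reservations hit the lock.
      val ok = (1 to 1000).forall(_ => reserver.acquire(100))
      ```

      With per-record reservation this workload would take the lock 1000 times; with 64 KiB chunks it takes it twice, which is the effect the patch aims for.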

      Test case:
      ```scala
      val start = System.currentTimeMillis()
      val data = sc.parallelize(0 until Integer.MAX_VALUE, 100)
        .persist(StorageLevel.OFF_HEAP)
        .count()
      println(System.currentTimeMillis() - start)
      ```

      Test result (wall-clock time in ms, five runs each):

      before: 27647 29108 28591 28264 27232
      after:  26868 26358 27767 26653 26693


          People

            Assignee: coneyliu Xianyang Liu
            Reporter: coneyliu Xianyang Liu
            Votes: 0
            Watchers: 4
