
SPARK-25827: Replicating a block > 2gb with encryption fails

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.1, 3.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

    There are a couple of issues with replicating and remote reads of large encrypted blocks, which try to create buffers where they shouldn't. Part of this is properly limiting the size of arrays, which is tracked under SPARK-25904, but other issues are specific to encryption and to trying to convert an EncryptedBlockData into a regular ByteBuffer.

    EDIT: moved the general array-size work under SPARK-25904.
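
    For background on why materializing a whole large block fails, here is a minimal standalone sketch (not code from the Spark patch; the object name and block size are made up for illustration): a single java.nio.ByteBuffer is indexed by Int, so it can hold at most Int.MaxValue (~2 GB) bytes, which is why converting a large EncryptedBlockData into one regular ByteBuffer cannot work.

    {code:scala}
    import java.nio.ByteBuffer

    // Illustrative only: shows the JVM limit that makes "one ByteBuffer per
    // block" impossible for blocks larger than Int.MaxValue (~2 GB) bytes.
    object ByteBufferLimitSketch {
      def main(args: Array[String]): Unit = {
        val blockSize: Long = 3L * 1024 * 1024 * 1024 // hypothetical 3 GB block

        if (blockSize > Int.MaxValue) {
          // ByteBuffer.allocate takes an Int; casting this Long would overflow
          // to a negative value, so no single buffer can back this block.
          println(s"cannot represent a $blockSize-byte block as a single ByteBuffer")
        } else {
          val buf = ByteBuffer.allocate(blockSize.toInt)
          println(s"allocated ${buf.capacity()} bytes")
        }
      }
    }
    {code}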


    People

    • Assignee: Imran Rashid (irashid)
    • Reporter: Imran Rashid (irashid)
    • Votes: 0
    • Watchers: 3
