Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.7.0
    • Fix Version/s: 3.0.0-alpha1
    • Component/s: fs/s3
    • Labels: None
    • Target Version/s:

Description

Currently, if fs.s3a.max.total.tasks tasks are already queued and another (part) upload wants to start, a RejectedExecutionException is thrown.

We should use a thread pool that blocks clients, throttling them gracefully, rather than throwing an exception. For instance, something similar to https://github.com/apache/incubator-s4/blob/master/subprojects/s4-comm/src/main/java/org/apache/s4/comm/staging/BlockingThreadPoolExecutorService.java
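
A minimal sketch of the kind of blocking submission this suggests, assuming a Semaphore sized to the thread count plus the queue capacity (fs.s3a.max.total.tasks); the class and method names below are illustrative only and are not the names used in the attached patches or in the linked S4 BlockingThreadPoolExecutorService.

{code:java}
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.Semaphore;

/**
 * Illustrative sketch: blocks the submitting thread when all worker
 * threads and queue slots are in use, instead of throwing
 * RejectedExecutionException.
 */
public class BlockingSubmitter {
  private final ExecutorService pool;
  private final Semaphore permits;

  public BlockingSubmitter(int maxThreads, int maxQueuedTasks) {
    this.pool = Executors.newFixedThreadPool(maxThreads);
    // One permit per worker thread plus one per queue slot.
    this.permits = new Semaphore(maxThreads + maxQueuedTasks);
  }

  public Future<?> submit(final Runnable task) throws InterruptedException {
    permits.acquire();            // blocks (throttles) the caller when full
    try {
      return pool.submit(new Runnable() {
        @Override
        public void run() {
          try {
            task.run();
          } finally {
            permits.release();    // free a slot once the task completes
          }
        }
      });
    } catch (RejectedExecutionException e) {
      permits.release();          // keep the permit count consistent
      throw e;
    }
  }

  public void shutdown() {
    pool.shutdown();
  }
}
{code}

With this approach, a caller submitting a part upload waits in submit() until capacity frees up rather than failing, which is the throttling behaviour described above.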

Attachments

    1. HADOOP-11684-001.patch (27 kB, Thomas Demoor)
    2. HADOOP-11684-002.patch (23 kB, Thomas Demoor)
    3. HADOOP-11684-003.patch (9 kB, Thomas Demoor)
    4. HADOOP-11684-004.patch (29 kB, Aaron Fabbri)
    5. HADOOP-11684-005.patch (29 kB, Aaron Fabbri)
    6. HADOOP-11684-006.patch (29 kB, Aaron Fabbri)

People

    • Assignee: Thomas Demoor (thodemoor)
    • Reporter: Thomas Demoor (thodemoor)
    • Votes: 1
    • Watchers: 13
