Jackrabbit Oak / OAK-8565

Using a lazy blob listing can cause Azure timeout


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 1.18.0
    • Component/s: segment-azure
    • Labels:
      None

      Description

      When listing Azure blobs lazily, the API client fetches the listing in segments (pages). If too much time passes between segment requests (e.g. because each blob is processed as it is returned), the listing fails with a TimeoutException:

      [FelixStartLevel] org.apache.jackrabbit.oak-segment-tar bundle org.apache.jackrabbit.oak-segment-tar:1.16.0.R1864511 (144)[org.apache.jackrabbit.oak.segment.SegmentNodeStoreFactory(204)] : The activate method has thrown an exception (java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.)
      java.util.NoSuchElementException: An error occurred while enumerating the result, check the original exception for details.
      	at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:113)
      	at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
      	at java.base/java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
      	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
      	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
      	at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
      	at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
      	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
      	at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
      	at org.apache.jackrabbit.oak.segment.azure.AzureArchiveManager.renameTo(AzureArchiveManager.java:134) [org.apache.jackrabbit.oak-segment-azure:1.16.0.R1864511]
      	at org.apache.jackrabbit.oak.segment.file.tar.TarReader.backupSafely(TarReader.java:208) [org.apache.jackrabbit.oak-segment-tar:1.16.0.R1864511]
      	at org.apache.jackrabbit.oak.segment.file.tar.TarReader.collectFileEntries(TarReader.java:154) [org.apache.jackrabbit.oak-segment-tar:1.16.0.R1864511]
      	at org.apache.jackrabbit.oak.segment.file.tar.TarReader.open(TarReader.java:99) [org.apache.jackrabbit.oak-segment-tar:1.16.0.R1864511]
      	at org.apache.jackrabbit.oak.segment.file.tar.TarFiles.<init>(TarFiles.java:395) [org.apache.jackrabbit.oak-segment-tar:1.16.0.R1864511]
      ...
      Caused by: com.microsoft.azure.storage.StorageException: The client could not finish the operation within specified maximum execution timeout.
      	at com.microsoft.azure.storage.core.ExecutionEngine.setupStorageRequest(ExecutionEngine.java:277)
      	at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:95)
      	at com.microsoft.azure.storage.core.LazySegmentedIterator.hasNext(LazySegmentedIterator.java:109)
      	... 58 common frames omitted
      Caused by: java.util.concurrent.TimeoutException: The client could not finish the operation within specified maximum execution timeout.
      	at com.microsoft.azure.storage.core.ExecutionEngine.setupStorageRequest(ExecutionEngine.java:276)
      	... 60 common frames omitted
      

      We should update the code to eagerly load all blobs into a list and return that list.
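      A minimal sketch of the eager approach, assuming a generic helper (the lazy iterable below is a plain in-memory stand-in for the SDK's segmented listing, and `materialize` is a hypothetical name, not the actual Oak fix):

      ```java
      import java.util.ArrayList;
      import java.util.List;

      public class EagerListing {

          // Hypothetical helper: drain a lazy Iterable into a List up front, so
          // that all paged requests to the storage service complete before any
          // slow per-item processing can push the next page request past the
          // client's maximum execution timeout.
          static <T> List<T> materialize(Iterable<T> lazy) {
              List<T> result = new ArrayList<>();
              for (T item : lazy) { // with a lazy listing, hasNext() may trigger a remote page fetch
                  result.add(item);
              }
              return result;
          }

          public static void main(String[] args) {
              // In-memory stand-in for a lazy segmented blob listing.
              Iterable<String> lazyListing = List.of("blob-0", "blob-1", "blob-2");

              List<String> blobs = materialize(lazyListing);

              // Slow per-blob work can now run without risking a listing timeout.
              blobs.forEach(System.out::println);
          }
      }
      ```

      The trade-off is memory: the whole listing is held in the list at once, which is acceptable here since the number of archive blobs is small.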

            People

            • Assignee: tomek.rekawek Tomek Rękawek
            • Reporter: tomek.rekawek Tomek Rękawek