Jackrabbit Oak / OAK-6452

IllegalStateException: too much data for a segment during oak-upgrade from segment to segment-tar


Details

    • Type: Bug
    • Status: Closed
    • Priority: Critical
    • Resolution: Cannot Reproduce
    • Affects Version/s: 1.7.3
    • Fix Version/s: 1.7.6, 1.8.0
    • Component/s: segment-tar, upgrade
    • Labels: None

    Description

      During the migration of a large repository from the old segment format to segment-tar using oak-upgrade-1.7.3, I got the following error:

      14.07.2017 09:05:51.920 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.RepositorySidegrade - Copying node #893330000: /oak:index/uuid/:index/a9f9a3ed-6183-4e9e-9480-1b4fd196a829
      14.07.2017 10:00:27.957 [TarMK flush [extracted/crx-quickstart/repository-oak-upgrade/segmentstore]] *ERROR*  org.apache.jackrabbit.oak.segment.file.SafeRunnable - Uncaught exception in TarMK flush [extracted/crx-quickstart/repository-oak-upgrade/segmentstore]
      java.lang.IllegalStateException: too much data for a segment
              at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.flush(SegmentBufferWriter.java:322) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.SegmentBufferWriterPool.flush(SegmentBufferWriterPool.java:142) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter.flush(DefaultSegmentWriter.java:138) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.FileStore$8.call(FileStore.java:345) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.FileStore$8.call(FileStore.java:342) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.TarRevisions.doFlush(TarRevisions.java:213) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.TarRevisions.flush(TarRevisions.java:201) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.FileStore.flush(FileStore.java:342) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.FileStore$3.run(FileStore.java:242) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at org.apache.jackrabbit.oak.segment.file.SafeRunnable.run(SafeRunnable.java:67) ~[oak-upgrade-1.7.3.jar:1.7.3]
              at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
              at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_131]
              at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_131]
              at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_131]
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_131]
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_131]
              at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
      14.07.2017 10:00:28.448 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - TarMK closed: extracted/crx-quickstart/repository-oak-upgrade/segmentstore
      14.07.2017 10:00:28.658 [main] *INFO*  org.apache.jackrabbit.oak.plugins.segment.file.FileStore - TarMK closed: extracted/crx-quickstart/repository/segmentstore
      Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
              at org.apache.jackrabbit.oak.segment.RecordType.values(RecordType.java:24)
              at org.apache.jackrabbit.oak.segment.ImmutableRecordNumbers$1$1.getType(ImmutableRecordNumbers.java:86)
              at org.apache.jackrabbit.oak.segment.Segment.forEachRecord(Segment.java:703)
              at org.apache.jackrabbit.oak.segment.file.AbstractFileStore.readBinaryReferences(AbstractFileStore.java:277)
              at org.apache.jackrabbit.oak.segment.file.FileStore.writeSegment(FileStore.java:511)
              at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.flush(SegmentBufferWriter.java:356)
              at org.apache.jackrabbit.oak.segment.SegmentBufferWriter.prepare(SegmentBufferWriter.java:423)
              at org.apache.jackrabbit.oak.segment.RecordWriters$RecordWriter.write(RecordWriters.java:70)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeValueRecord(DefaultSegmentWriter.java:499)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeString(DefaultSegmentWriter.java:518)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeMap(DefaultSegmentWriter.java:324)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:870)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:804)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:872)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:804)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:867)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:804)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:867)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:804)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.access$800(DefaultSegmentWriter.java:257)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$8.execute(DefaultSegmentWriter.java:246)
              at org.apache.jackrabbit.oak.segment.SegmentBufferWriterPool.execute(SegmentBufferWriterPool.java:100)
              at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter.writeNode(DefaultSegmentWriter.java:242)
              at org.apache.jackrabbit.oak.segment.SegmentWriter.writeNode(SegmentWriter.java:141)
              at org.apache.jackrabbit.oak.segment.SegmentNodeBuilder.getNodeState(SegmentNodeBuilder.java:132)
              at org.apache.jackrabbit.oak.segment.scheduler.Commit.hasChanges(Commit.java:102)
              at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.execute(LockBasedScheduler.java:229)
              at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.schedule(LockBasedScheduler.java:208)
              at org.apache.jackrabbit.oak.segment.SegmentNodeStore.merge(SegmentNodeStore.java:195)
              at org.apache.jackrabbit.oak.spi.state.ProxyNodeStore.merge(ProxyNodeStore.java:43)
              at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.migrateWithCheckpoints(RepositorySidegrade.java:373)
              at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copyState(RepositorySidegrade.java:338)
      Command exited with non-zero status 1
      

      The command used:

      java -Xmx20G -XX:MaxPermSize=2048M -jar tools/oak-upgrade-1.7.3.jar segment-old:crx-quickstart/repository/ crx-quickstart/repository-oak-upgrade
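
      (Side note: the 1.8.0_131 frames above show this runs on Java 8, where -XX:MaxPermSize is ignored because PermGen no longer exists.) For the next attempt it might help to capture a heap dump and GC log when the OOM hits, using standard HotSpot flags; the dump/log file names below are just placeholders:

      java -Xmx20G -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=oak-upgrade-oom.hprof -verbose:gc -Xloggc:oak-upgrade-gc.log -jar tools/oak-upgrade-1.7.3.jar segment-old:crx-quickstart/repository/ crx-quickstart/repository-oak-upgrade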
      

      Note that when this happened, the logs showed it had already copied ~156M direct child nodes of /oak:index/uuid/ (the uuid index entries, as in the log line above).

      I previously tried with only a 10G heap, and it failed faster with an OOM.
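
      To double-check how large the uuid index really is, the partially written target store can be opened read-only and the children of /oak:index/uuid/:index counted. A minimal sketch, assuming the oak-segment-tar 1.7.x API (FileStoreBuilder / SegmentNodeStoreBuilders) and that the crashed store can still be opened; the class name is just for illustration:

      import java.io.File;

      import org.apache.jackrabbit.oak.segment.SegmentNodeStoreBuilders;
      import org.apache.jackrabbit.oak.segment.file.FileStoreBuilder;
      import org.apache.jackrabbit.oak.segment.file.ReadOnlyFileStore;
      import org.apache.jackrabbit.oak.spi.state.NodeState;
      import org.apache.jackrabbit.oak.spi.state.NodeStore;

      // Hypothetical helper (not part of Oak): counts the uuid index entries
      // in the partially migrated segment-tar store.
      public class UuidIndexChildCount {

          public static void main(String[] args) throws Exception {
              // Target store from the log above; adjust the path as needed.
              File dir = new File("crx-quickstart/repository-oak-upgrade/segmentstore");

              // Open read-only so this check cannot modify the store.
              ReadOnlyFileStore store = FileStoreBuilder.fileStoreBuilder(dir).buildReadOnly();
              try {
                  NodeStore nodeStore = SegmentNodeStoreBuilders.builder(store).build();

                  // The copy log shows entries under /oak:index/uuid/:index/<uuid>.
                  NodeState idx = nodeStore.getRoot()
                          .getChildNode("oak:index")
                          .getChildNode("uuid")
                          .getChildNode(":index");

                  // Counting ~156M children walks a very large map, so this can take a while.
                  System.out.println("children of /oak:index/uuid/:index: "
                          + idx.getChildNodeCount(Long.MAX_VALUE));
              } finally {
                  store.close();
              }
          }
      }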


            People

              Assignee: Valentin Olteanu (volteanu)
              Reporter: Valentin Olteanu (volteanu)
              Votes: 0
              Watchers: 3
