HBase / HBASE-554

filters generate StackOverflowException


Details

    • Type: Bug
    • Status: Closed
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.16.0, 0.1.0, 0.2.0
    • Fix Version/s: 0.1.1, 0.2.0
    • Component/s: Filters
    • Labels: None

    Description

      Below is from the mailing list.

      You're doing nothing wrong.

      The filters as written recurse until they find a match. If there are long stretches between matching rows, you will get a StackOverflowError. Filters need to be changed. Thanks for pointing this out. Can you do without them for the moment, until we get a chance to fix it?

      St.Ack

      David Alves wrote:
      > Hi St.Ack and all
      >
      > The error always occurs when trying to see if there are more rows to
      > process.
      > Yes, I'm using a filter (RegExpRowFilter) to select only the rows (any
      > row key) that match a specific value in one of the columns.
      > Then I obtain the scanner, just test the hasNext method, close the
      > scanner, and return.
      > Am I doing something wrong?
      > Still, a StackOverflowError is not supposed to happen, right?
      >
      > Regards
      > David Alves
      > On Thu, 2008-03-27 at 12:36 -0700, stack wrote:
      >> You are using a filter? If so, tell us more about it.
      >> St.Ack
      >>
      >> David Alves wrote:
      >>> Hi guys
      >>>
      >>> I'm using HBase to keep data that is later indexed.
      >>> The data is indexed in chunks, so the cycle is: get XXXX records, index
      >>> them, check for more records, etc.
      >>> When I tried the candidate-2 instead of the old 0.16.0 (which I
      >>> switched from due to the regionservers becoming unresponsive) I got the
      >>> error at the end of this email, well into an indexing job.
      >>> Do you have any idea why? Am I doing something wrong?
      >>>
      >>> David Alves
      >>>
      >>> java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:
      >>> java.io.IOException: java.lang.StackOverflowError
      >>> at java.io.DataInputStream.readFully(DataInputStream.java:178)
      >>> at java.io.DataInputStream.readLong(DataInputStream.java:399)
      >>> at org.apache.hadoop.dfs.DFSClient$BlockReader.readChunk(DFSClient.java:735)
      >>> at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:234)
      >>> at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:176)
      >>> at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:193)
      >>> at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:157)
      >>> at org.apache.hadoop.dfs.DFSClient$BlockReader.read(DFSClient.java:658)
      >>> at org.apache.hadoop.dfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1130)
      >>> at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1166)
      >>> at java.io.DataInputStream.readFully(DataInputStream.java:178)
      >>> at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:56)
      >>> at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:90)
      >>> at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1829)
      >>> at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1729)
      >>> at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1775)
      >>> at org.apache.hadoop.io.MapFile$Reader.next(MapFile.java:461)
      >>> at org.apache.hadoop.hbase.HStore$StoreFileScanner.getNext(HStore.java:2350)
      >>> at org.apache.hadoop.hbase.HAbstractScanner.next(HAbstractScanner.java:256)
      >>> at org.apache.hadoop.hbase.HStore$HStoreScanner.next(HStore.java:2561)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1807)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
      >>> ...
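
      The run of HRegion$HScanner.next frames at HRegion.java:1843 is the recursion described in the reply above: when the filter rejects a row, the scanner calls next() again for the following row instead of looping, so every skipped row costs a stack frame. Below is a minimal sketch of that pattern and of a constant-stack rewrite; it is not HBase's actual scanner or filter API, and the FilteredScanner class and its advance() helper are hypothetical names used only for illustration.

          import java.util.Iterator;
          import java.util.NoSuchElementException;
          import java.util.function.Predicate;

          // Hypothetical, simplified scanner over row keys; not the real
          // HScanner or RowFilterInterface from HBase.
          final class FilteredScanner implements Iterator<String> {
              private final Iterator<String> rows;     // underlying row source
              private final Predicate<String> filter;  // true = row passes the filter
              private String nextMatch;                // one-row look-ahead

              FilteredScanner(Iterator<String> rows, Predicate<String> filter) {
                  this.rows = rows;
                  this.filter = filter;
                  advance();
              }

              // Problematic shape (what the repeated frames above correspond to):
              // recurse once per rejected row, so a long stretch of non-matching
              // rows overflows the stack.
              //
              //   private String nextMatchRecursive() {
              //       if (!rows.hasNext()) return null;
              //       String row = rows.next();
              //       return filter.test(row) ? row : nextMatchRecursive();
              //   }

              // Fixed shape: skip rejected rows in a loop, using constant stack
              // no matter how many rows sit between matches.
              private void advance() {
                  nextMatch = null;
                  while (rows.hasNext()) {
                      String row = rows.next();
                      if (filter.test(row)) {
                          nextMatch = row;
                          return;
                      }
                  }
              }

              @Override public boolean hasNext() { return nextMatch != null; }

              @Override public String next() {
                  if (nextMatch == null) throw new NoSuchElementException();
                  String row = nextMatch;
                  advance();
                  return row;
              }
          }

      With the loop in place, the usage described above, obtaining a scanner, testing hasNext once, and closing it, stays at a fixed stack depth even when a RegExpRowFilter matches only a small fraction of a large table.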

      Attachments

        1. hbase-554.patch (5 kB, Clint Morgan)
        2. hbase-554-v2.patch (5 kB, Clint Morgan)

          People

            Assignee: jimk (Jim Kellerman)
            Reporter: stack (Michael Stack)
            Votes: 0
            Watchers: 0

            Dates

              Created:
              Updated:
              Resolved: