[HBASE-27052] TestAsyncTableScanner.testScanWrongColumnFamily is flaky

Details

    • Type: Test
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.5.0
    • Fix Version/s: 2.5.0, 3.0.0-alpha-3
    • Component/s: test
    • Labels: None
    • Hadoop Flags: Reviewed

    Description

      ndimiduk, have you seen something like this before?

      [ERROR] org.apache.hadoop.hbase.client.TestAsyncTableScanner.testScanWrongColumnFamily[3: table=raw, scan=batchSmallResultSize]  Time elapsed: 0.018 s  <<< FAILURE!
      java.lang.AssertionError: 
      Expected:
      a collection containing (SpanKind with a name that a string starting with "SCAN default:async" and
      SpanKind with a parentSpanId that "612b6a689e063c9b" and
      SpanData with StatusCode that is <ERROR> and
      SpanData having Exception with Attributes that Attributes containing [is <exception.type>->a string ending with "org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException"] and
      SpanData that hasEnded)
      but:
      SpanKind with a name that a string starting with "SCAN default:async" name was
      "AsyncRegionLocator.getRegionLocation", SpanKind with a name that a string starting with "SCAN default:async" name was
      "Region.getScanner", SpanKind with a name that a string starting with "SCAN default:async" name was
      "hbase.pb.ClientService/Scan", SpanKind with a name that a string starting with "SCAN default:async" name was
      "RpcServer.process", SpanData having Exception with Attributes that Attributes containing [is <exception.type>->a string ending with "org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException"] exception attributes
      Attributes was [<exception.message=org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException:
      Column family WrongColumnFamily does not exist in region async,,1652930099982.5fe23e4a8e57156da060c1ecfade4b97. in table 'async',
      {TABLE_ATTRIBUTES => {METADATA => {'hbase.store.file-tracker.impl' => 'DEFAULT'}}}, {NAME => 'cf', BLOOMFILTER => 'ROW', IN_MEMORY => 'false', VERSIONS => '1', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', COMPRESSION => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536 B (64KB)', REPLICATION_SCOPE => '0'}
      	at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7831)
      	at org.apache.hadoop.hbase.regionserver.HRegion.lambda$getScanner$3(HRegion.java:3013)
      	at org.apache.hadoop.hbase.trace.TraceUtil.trace(TraceUtil.java:185)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:3002)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2997)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2991)
      	at org.apache.hadoop.hbase.regionserver.RSRpcServices.newRegionScanner(RSRpcServices.java:3187)
      	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3544)
      	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:45819)
      	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:385)
      	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)
      	at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:104)
      	at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:84)
      >, <exception.stacktrace=org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException):
      org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException:
      Column family WrongColumnFamily does not exist in region
      async,,1652930099982.5fe23e4a8e57156da060c1ecfade4b97. in table 'async',
      {TABLE_ATTRIBUTES => {METADATA => {'hbase.store.file-tracker.impl' => 'DEFAULT'}}}, {NAME => 'cf', BLOOMFILTER => 'ROW', IN_MEMORY => 'false', VERSIONS => '1', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'NONE', COMPRESSION => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536 B (64KB)', REPLICATION_SCOPE => '0'}
      	at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7831)
      	at org.apache.hadoop.hbase.regionserver.HRegion.lambda$getScanner$3(HRegion.java:3013)
      	at org.apache.hadoop.hbase.trace.TraceUtil.trace(TraceUtil.java:185)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:3002)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2997)
      	at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:2991)
      	at org.apache.hadoop.hbase.regionserver.RSRpcServices.newRegionScanner(RSRpcServices.java:3187)
      	at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:3544)
      	at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:45819)
      	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:385)
      	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)
      	at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:104)
      	at org.apache.hadoop.hbase.ipc.RpcHandler.run(RpcHandler.java:84)
      
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.onCallFinished(AbstractRpcClient.java:385)
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.access$100(AbstractRpcClient.java:92)
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:422)
      	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$3.run(AbstractRpcClient.java:417)
      	at org.apache.hadoop.hbase.ipc.Call.callComplete(Call.java:114)
      	at org.apache.hadoop.hbase.ipc.Call.setException(Call.java:129)
      	at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.readResponse(NettyRpcDuplexHandler.java:166)
      	at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelRead(NettyRpcDuplexHandler.java:196)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
      	at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)
      	at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
      	at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
      	at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
      	at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
      	at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
      	at org.apache.hbase.thirdparty.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
      	at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
      	at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
      	at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
      	at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
      	at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
      	at org.apache.hbase.thirdparty.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
      	at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
      	at java.lang.Thread.run(Thread.java:750)
      >,
      <exception.type=org.apache.hadoop.hbase.ipc.RemoteWithExtrasException>], 
      SpanKind with a name that a string starting with "SCAN default:async" name was
      "hbase.pb.ClientService/Scan",
      SpanKind with a name that a string starting with "SCAN default:async" name was
      "testScanWrongColumnFamily[3: table=raw, scan=batchSmallResultSize]"
      	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20)
      	at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:8)
      	at org.apache.hadoop.hbase.client.TestAsyncTableScanner.assertTraceError(TestAsyncTableScanner.java:150)
      	at org.apache.hadoop.hbase.client.AbstractTestAsyncTableScan.testScanWrongColumnFamily(AbstractTestAsyncTableScan.java:272)
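
      For readers parsing the Hamcrest mismatch output above: the assertion is effectively
      "the exported spans contain at least one span whose name starts with 'SCAN default:async',
      whose status is ERROR, which carries an exception event whose exception.type ends with
      NoSuchColumnFamilyException, and which has ended." Below is a minimal, self-contained
      sketch of that condition using plain Hamcrest over the OpenTelemetry SDK SpanData type;
      the class and method names are illustrative only, not the matcher utilities the test
      actually uses.

      // Illustrative sketch of the expected-span condition quoted in the failure above.
      import io.opentelemetry.api.common.AttributeKey;
      import io.opentelemetry.api.trace.StatusCode;
      import io.opentelemetry.sdk.trace.data.EventData;
      import io.opentelemetry.sdk.trace.data.SpanData;
      import org.hamcrest.Description;
      import org.hamcrest.Matcher;
      import org.hamcrest.TypeSafeMatcher;

      import java.util.Collection;

      import static org.hamcrest.MatcherAssert.assertThat;
      import static org.hamcrest.Matchers.hasItem;

      public class ScanTraceAssertionSketch {

        /** Matches a SpanData that satisfies the expectation quoted in the failure message. */
        static Matcher<SpanData> failedScanSpan(String namePrefix, String exceptionTypeSuffix) {
          return new TypeSafeMatcher<SpanData>() {
            @Override
            protected boolean matchesSafely(SpanData span) {
              // Look for an exception event whose exception.type attribute ends with the suffix.
              boolean hasException = false;
              for (EventData event : span.getEvents()) {
                String type = event.getAttributes().get(AttributeKey.stringKey("exception.type"));
                if (type != null && type.endsWith(exceptionTypeSuffix)) {
                  hasException = true;
                  break;
                }
              }
              return span.getName().startsWith(namePrefix)
                  && span.getStatus().getStatusCode() == StatusCode.ERROR
                  && hasException
                  && span.hasEnded();
            }

            @Override
            public void describeTo(Description description) {
              description.appendText("a span named \"" + namePrefix + "...\" with status ERROR, an ")
                  .appendText(exceptionTypeSuffix + " exception event, and hasEnded");
            }
          };
        }

        /** The failing assertion, boiled down: at least one exported span must match. */
        static void assertScanFailureTraced(Collection<SpanData> spans) {
          assertThat(spans, hasItem(failedScanSpan("SCAN default:async",
              "org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException")));
        }
      }

      In this particular run, the span that did record an exception event reports
      exception.type=org.apache.hadoop.hbase.ipc.RemoteWithExtrasException rather than a type
      ending in NoSuchColumnFamilyException, so no exported span satisfies all of the expected
      clauses at once.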
      

          People

            Assignee: Nick Dimiduk (ndimiduk)
            Reporter: Andrew Kyle Purtell (apurtell)
