HBase
  HBASE-5853

java.lang.RuntimeException: readObject can't find class org.apache.hadoop.hdfs.protocol.HdfsFileStatus

    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 0.92.1
    • Fix Version/s: 0.92.1, 0.94.1
    • Component/s: regionserver
    • Labels: None
    • Environment:

      hadoop-0.23.1 hbase-0.92.1

      Description

      2012-04-23 12:51:07,474 WARN org.apache.hadoop.ipc.Client: Unexpected error reading responses on connection Thread[IPC Client (1260987126) connection to server121/172.16.40.121:9000 from smp,5,main]
      java.lang.RuntimeException: readObject can't find class org.apache.hadoop.hdfs.protocol.HdfsFileStatus
      at org.apache.hadoop.io.ObjectWritable.loadClass(ObjectWritable.java:372)
      at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:223)
      at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:75)
      at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:832)
      at org.apache.hadoop.ipc.Client$Connection.run(Client.java:756)
      Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.protocol.HdfsFileStatus not found
      at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1151)
      at org.apache.hadoop.io.ObjectWritable.loadClass(ObjectWritable.java:368)
      ... 4 more
      2012-04-23 12:51:07,797 FATAL org.apache.hadoop.hbase.regionserver.HRegionServer: ABORTING region server server124,60020,1335152900476: Replay of HLog required. Forcing server shutdown
      org.apache.hadoop.hbase.DroppedSnapshotException: region: hbase_cdr,e0072b2b-5e19-431f-bb69-a6427765eac4,1334902272934.8365a7cbf90dd558f297d70224113c8a.
      at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1278)
      at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1162)
      at org.apache.hadoop.hbase.regionserver.HRegion.flushcache(HRegion.java:1104)
      at org.apache.hadoop.hbase.regionserver.MemStoreFlusher.flushRegion(MemStoreFlusher.java:400)
      at org.apache.hadoop.hbase.regionserver.MemStoreFlusher.flushOneForGlobalPressure(MemStoreFlusher.java:202)
      at org.apache.hadoop.hbase.regionserver.MemStoreFlusher.run(MemStoreFlusher.java:223)
      at java.lang.Thread.run(Thread.java:662)
      Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Error reading responses; Host Details : local host is: "server124/172.16.40.124"; destination host is: ""server121":9000;
      at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:724)
      at org.apache.hadoop.ipc.Client.call(Client.java:1094)
      at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:193)
      at $Proxy10.getFileInfo(Unknown Source)
      at sun.reflect.GeneratedMethodAccessor15.invoke(Unknown Source)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:100)
      at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:65)
      at $Proxy10.getFileInfo(Unknown Source)
      at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1172)
      at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:725)
      at org.apache.hadoop.hbase.regionserver.StoreFile.computeHDFSBlockDistribution(StoreFile.java:449)
      at org.apache.hadoop.hbase.regionserver.StoreFile.open(StoreFile.java:473)
      at org.apache.hadoop.hbase.regionserver.StoreFile.createReader(StoreFile.java:548)
      at org.apache.hadoop.hbase.regionserver.Store.internalFlushCache(Store.java:595)
      at org.apache.hadoop.hbase.regionserver.Store.flushCache(Store.java:506)
      at org.apache.hadoop.hbase.regionserver.Store.access$100(Store.java:89)
      at org.apache.hadoop.hbase.regionserver.Store$StoreFlusherImpl.flushCache(Store.java:1905)
      at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1254)
      ... 6 more
      Caused by: java.io.IOException: Error reading responses
      at org.apache.hadoop.ipc.Client$Connection.run(Client.java:763)
      Caused by: java.lang.RuntimeException: readObject can't find class org.apache.hadoop.hdfs.protocol.HdfsFileStatus
      at org.apache.hadoop.io.ObjectWritable.loadClass(ObjectWritable.java:372)
      at org.apache.hadoop.io.ObjectWritable.readObject(ObjectWritable.java:223)
      at org.apache.hadoop.io.ObjectWritable.readFields(ObjectWritable.java:75)
      at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:832)
      at org.apache.hadoop.ipc.Client$Connection.run(Client.java:756)
      Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.protocol.HdfsFileStatus not found
      at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1151)
      at org.apache.hadoop.io.ObjectWritable.loadClass(ObjectWritable.java:368)
      ... 4 more

        Activity

        Lars Hofhansl added a comment - edited

        Did you recompile your version of 0.92.1 against Hadoop 0.23?

        jiafeng.zhang added a comment -

        Recompiled:
        mvn clean site install assembly:single -Dmaven.test.skip -Prelease -Dhadoop.profile=23

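        One way to confirm the rebuild really pulled in the Hadoop 0.23 artifacts is to list the Hadoop jars that end up in the assembled tarball; a minimal sketch, assuming the tarball has been unpacked and keeps the usual lib/ layout:

        # Run from the unpacked HBase directory: an 0.23-profile build should ship
        # hadoop-hdfs-*.jar and hadoop-common-*.jar rather than the monolithic
        # hadoop-core-*.jar from the 1.x line.
        find . -name 'hadoop-*.jar'
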
        Jonathan Hsieh added a comment - edited

        Are you running this (your client) as processes or as part of unit tests?

        When you run 'hbase classpath', is hadoop-hdfs-*.jar or hadoop-core-*.jar part of your path?

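        A quick way to answer that from the shell; a minimal sketch, assuming the hbase launcher script is on the PATH:

        # Print only the Hadoop entries of the HBase classpath. Against Hadoop 0.23
        # you should see hadoop-hdfs-*.jar and hadoop-common-*.jar; seeing only
        # hadoop-core-*.jar means the server is still running 1.x-style jars.
        hbase classpath | tr ':' '\n' | grep -E 'hadoop-(hdfs|common|core)'
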
        Lars Hofhansl added a comment -

        I agree that looks like a setup problem, rather than an HBase problem.

        Lars Hofhansl added a comment -

        Has anybody else seen this?
        It seems any HBase flushCache on top of Hadoop 0.23 would cause this.

        Lars Hofhansl added a comment -

        @jiafeng: Can you tell us what exactly you did when this happened?

        Lars Hofhansl added a comment -

        Moving this out until somebody confirms that it is a serious problem.

        jiafeng.zhang added a comment -

        @Lars Hofhansl: importing data into HBase.

        jiafeng.zhang added a comment -

        The exception occurred during that operation.

        Lars Hofhansl added a comment -

        I verified that org.apache.hadoop.hdfs.protocol.HdfsFileStatus still exists in Hadoop trunk.
        So I am pretty sure this is a setup issue.
        Please re-open if this is not correct (in which case I apologize in advance).

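        For anyone hitting the same trace, a quick way to verify whether the class is actually reachable from the region server's classpath; a sketch, assuming 'hbase classpath' and the standard unzip tool are available on the node:

        # Scan every jar on the HBase classpath for the class the IPC layer
        # failed to load; no output means the hadoop-hdfs jar is missing.
        for j in $(hbase classpath | tr ':' '\n' | grep '\.jar$'); do
          unzip -l "$j" 2>/dev/null \
            | grep -q 'org/apache/hadoop/hdfs/protocol/HdfsFileStatus.class' \
            && echo "found in $j"
        done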

          People

          • Assignee: Unassigned
          • Reporter: jiafeng.zhang
          • Votes: 0
          • Watchers: 2
