Test Info

| Library Name | HBase |
|---|---|
| Version #1 | 0.98.9 |
| Version #2 | branch-1.0 |
| Java Version | 1.7.0_60 |
| Subject | Binary Compatibility |
Test Results

| Total Java ARchives | 15 |
|---|---|
| Total Methods / Classes | 1572 / 3953 |
| Verdict | Incompatible (15.4%) |
Problem Summary

| | Severity | Count |
|---|---|---|
| Added Methods | - | 197 |
| Removed Methods | High | 35 |
| Problems with Data Types | High | 21 |
| | Medium | 5 |
| | Low | 25 |
| Problems with Methods | High | 12 |
| | Medium | 0 |
| | Low | 0 |
| Other Changes in Data Types | - | 34 |
Added Methods (197)
hbase-client-1.0.0.jar,
AccessControlClient.class
package org.apache.hadoop.hbase.security.access
AccessControlClient.getUserPermissions ( org.apache.hadoop.hbase.client.Connection connection, String tableRegex ) [static] : java.util.List<UserPermission>
[mangled: org/apache/hadoop/hbase/security/access/AccessControlClient.getUserPermissions:(Lorg/apache/hadoop/hbase/client/Connection;Ljava/lang/String;)Ljava/util/List;]
hbase-client-1.0.0.jar,
Append.class
package org.apache.hadoop.hbase.client
Append.setACL ( java.util.Map x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setAttribute ( String name, byte[ ] value ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Append;]
Append.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Append.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Append.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility expression ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setClusterIds ( java.util.List x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setClusterIds ( java.util.List<java.util.UUID> clusterIds ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setDurability ( Durability d ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setDurability ( Durability x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyCellMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyCellMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.Cell>> map ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setFamilyMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.KeyValue>> map ) : Append *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setId ( String id ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Append.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Append.setReturnResults ( boolean returnResults ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setReturnResults:(Z)Lorg/apache/hadoop/hbase/client/Append;]
Append.setTTL ( long ttl ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setTTL:(J)Lorg/apache/hadoop/hbase/client/Append;]
Append.setTTL ( long x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setTTL:(J)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setWriteToWAL ( boolean write ) : Append *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Append.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Append;]
Append.setWriteToWAL ( boolean x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Mutation;]
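Most of these additions are covariant overrides: setters inherited from Mutation and OperationWithAttributes now also exist with Append as the declared return type, which lets calls be chained. Below is a minimal, hypothetical sketch against the branch-1.0 signatures listed above; the row, family, qualifier, and id values are placeholders, and Append.add comes from the existing 0.98/1.0 API rather than this report.

```java
import org.apache.hadoop.hbase.client.Append;
import org.apache.hadoop.hbase.client.Durability;
import org.apache.hadoop.hbase.util.Bytes;

public class AppendSetterExample {
    public static void main(String[] args) {
        Append append = new Append(Bytes.toBytes("row-1"));
        append.add(Bytes.toBytes("cf"), Bytes.toBytes("log"), Bytes.toBytes("|entry"));
        // In branch-1.0 each setter returns Append rather than Mutation or
        // OperationWithAttributes, so the calls can be chained.
        append.setDurability(Durability.SYNC_WAL)
              .setId("audit-append")
              .setTTL(60000L);   // per-mutation TTL, new in 1.0
    }
}
```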
hbase-client-1.0.0.jar,
Attributes.class
package org.apache.hadoop.hbase.client
Attributes.setAttribute ( String p1, byte[ ] p2 ) [abstract] : Attributes
[mangled: org/apache/hadoop/hbase/client/Attributes.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
hbase-client-1.0.0.jar,
CompareFilter.class
package org.apache.hadoop.hbase.filter
CompareFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/CompareFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
Connection.class
package org.apache.hadoop.hbase.client
Connection.close ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Connection.close:()V]
Connection.getAdmin ( ) [abstract] : Admin
[mangled: org/apache/hadoop/hbase/client/Connection.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
Connection.getConfiguration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/hadoop/hbase/client/Connection.getConfiguration:()Lorg/apache/hadoop/conf/Configuration;]
Connection.getRegionLocator ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : RegionLocator
[mangled: org/apache/hadoop/hbase/client/Connection.getRegionLocator:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/RegionLocator;]
Connection.getTable ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : Table
[mangled: org/apache/hadoop/hbase/client/Connection.getTable:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/Table;]
Connection.getTable ( org.apache.hadoop.hbase.TableName p1, java.util.concurrent.ExecutorService p2 ) [abstract] : Table
[mangled: org/apache/hadoop/hbase/client/Connection.getTable:(Lorg/apache/hadoop/hbase/TableName;Ljava/util/concurrent/ExecutorService;)Lorg/apache/hadoop/hbase/client/Table;]
Connection.isClosed ( ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Connection.isClosed:()Z]
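Connection is the new client entry point in branch-1.0, replacing direct HConnection/HTable management. The following is a hedged sketch of the intended usage; ConnectionFactory and HBaseConfiguration are assumed from the 1.0 client (they are not part of this report), and the table name is a placeholder.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

public class ConnectionExample {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        // Connection is heavyweight and thread-safe; Table and Admin instances
        // are lightweight and should be obtained per use.
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("example_table"));
             Admin admin = connection.getAdmin()) {
            // ... use table and admin, then let try-with-resources close them ...
        }
    }
}
```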
hbase-client-1.0.0.jar,
Consistency.class
package org.apache.hadoop.hbase.client
Consistency.valueOf ( String name ) [static] : Consistency
[mangled: org/apache/hadoop/hbase/client/Consistency.valueOf:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Consistency;]
Consistency.values ( ) [static] : Consistency[ ]
[mangled: org/apache/hadoop/hbase/client/Consistency.values:()[Lorg/apache/hadoop/hbase/client/Consistency;]
hbase-client-1.0.0.jar,
FuzzyRowFilter.class
package org.apache.hadoop.hbase.filter
FuzzyRowFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/FuzzyRowFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
HConnection.class
package org.apache.hadoop.hbase.client
HConnection.getAdmin ( ) [abstract] : Admin
[mangled: org/apache/hadoop/hbase/client/HConnection.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
HConnection.getRegionLocator ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : RegionLocator
[mangled: org/apache/hadoop/hbase/client/HConnection.getRegionLocator:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/RegionLocator;]
HConnection.updateCachedLocations ( org.apache.hadoop.hbase.TableName p1, byte[ ] p2, byte[ ] p3, Object p4, org.apache.hadoop.hbase.ServerName p5 ) [abstract] : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/HConnection.updateCachedLocations:(Lorg/apache/hadoop/hbase/TableName;[B[BLjava/lang/Object;Lorg/apache/hadoop/hbase/ServerName;)V]
hbase-client-1.0.0.jar,
HRegionInfo.class
package org.apache.hadoop.hbase
HRegionInfo.createRegionName ( TableName tableName, byte[ ] startKey, byte[ ] id, int replicaId, boolean newFormat ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/HRegionInfo.createRegionName:(Lorg/apache/hadoop/hbase/TableName;[B[BIZ)[B]
HRegionInfo.createRegionName ( TableName tableName, byte[ ] startKey, long regionid, int replicaId, boolean newFormat ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/HRegionInfo.createRegionName:(Lorg/apache/hadoop/hbase/TableName;[BJIZ)[B]
HRegionInfo.getReplicaId ( ) : int
[mangled: org/apache/hadoop/hbase/HRegionInfo.getReplicaId:()I]
HRegionInfo.HRegionInfo ( HRegionInfo other, int replicaId )
[mangled: org/apache/hadoop/hbase/HRegionInfo."<init>":(Lorg/apache/hadoop/hbase/HRegionInfo;I)V]
HRegionInfo.HRegionInfo ( TableName tableName, byte[ ] startKey, byte[ ] endKey, boolean split, long regionid, int replicaId )
[mangled: org/apache/hadoop/hbase/HRegionInfo."<init>":(Lorg/apache/hadoop/hbase/TableName;[B[BZJI)V]
HRegionInfo.isSystemTable ( ) : boolean
[mangled: org/apache/hadoop/hbase/HRegionInfo.isSystemTable:()Z]
hbase-client-1.0.0.jar,
PrefixFilter.class
package org.apache.hadoop.hbase.filter
PrefixFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/PrefixFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
Put.class
package org.apache.hadoop.hbase.client
Put.setACL ( java.util.Map x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setAttribute ( String name, byte[ ] value ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Put;]
Put.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Put.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Put.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility expression ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setClusterIds ( java.util.List x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setClusterIds ( java.util.List<java.util.UUID> clusterIds ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setDurability ( Durability d ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setDurability ( Durability x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyCellMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyCellMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.Cell>> map ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setFamilyMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.KeyValue>> map ) : Put *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setId ( String id ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Put.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Put.setTTL ( long ttl ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setTTL:(J)Lorg/apache/hadoop/hbase/client/Put;]
Put.setTTL ( long x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setTTL:(J)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setWriteToWAL ( boolean write ) : Put *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Put.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Put;]
Put.setWriteToWAL ( boolean x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Mutation;]
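As with Append, the Put setters are now covariant, and per-mutation options such as setTTL and setACL appear. A hypothetical sketch of the 1.0 signatures follows; the row, column, user name, and permission are placeholders, the Put is only constructed (not submitted), and Put.add is the pre-existing API rather than an addition from this report.

```java
import org.apache.hadoop.hbase.client.Durability;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.security.access.Permission;
import org.apache.hadoop.hbase.util.Bytes;

public class PutSetterExample {
    public static void main(String[] args) {
        Put put = new Put(Bytes.toBytes("row-1"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
        put.setDurability(Durability.ASYNC_WAL)                        // returns Put in 1.0
           .setTTL(24L * 60 * 60 * 1000)                               // cell TTL in milliseconds
           .setACL("analyst", new Permission(Permission.Action.READ)); // per-mutation ACL
    }
}
```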
hbase-client-1.0.0.jar,
Result.class
package org.apache.hadoop.hbase.client
Result.create ( java.util.List<org.apache.hadoop.hbase.Cell> cells, Boolean exists, boolean stale ) [static] : Result
[mangled: org/apache/hadoop/hbase/client/Result.create:(Ljava/util/List;Ljava/lang/Boolean;Z)Lorg/apache/hadoop/hbase/client/Result;]
Result.create ( org.apache.hadoop.hbase.Cell[ ] cells, Boolean exists, boolean stale ) [static] : Result
[mangled: org/apache/hadoop/hbase/client/Result.create:([Lorg/apache/hadoop/hbase/Cell;Ljava/lang/Boolean;Z)Lorg/apache/hadoop/hbase/client/Result;]
Result.getTotalSizeOfCells ( Result result ) [static] : long
[mangled: org/apache/hadoop/hbase/client/Result.getTotalSizeOfCells:(Lorg/apache/hadoop/hbase/client/Result;)J]
Result.isStale ( ) : boolean
[mangled: org/apache/hadoop/hbase/client/Result.isStale:()Z]
hbase-client-1.0.0.jar,
Scan.class
package org.apache.hadoop.hbase.client
Scan.setACL ( java.util.Map x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setAttribute ( String name, byte[ ] value ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Scan.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Scan.setAuthorizations ( org.apache.hadoop.hbase.security.visibility.Authorizations authorizations ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setAuthorizations:(Lorg/apache/hadoop/hbase/security/visibility/Authorizations;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setAuthorizations ( org.apache.hadoop.hbase.security.visibility.Authorizations x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setAuthorizations:(Lorg/apache/hadoop/hbase/security/visibility/Authorizations;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setBatch ( int batch ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setBatch:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setCacheBlocks ( boolean cacheBlocks ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setCacheBlocks:(Z)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setCaching ( int caching ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setCaching:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setConsistency ( Consistency consistency ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setConsistency:(Lorg/apache/hadoop/hbase/client/Consistency;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setConsistency ( Consistency x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setConsistency:(Lorg/apache/hadoop/hbase/client/Consistency;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setId ( String id ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Scan.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Scan.setIsolationLevel ( IsolationLevel level ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setIsolationLevel:(Lorg/apache/hadoop/hbase/client/IsolationLevel;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setIsolationLevel ( IsolationLevel x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setIsolationLevel:(Lorg/apache/hadoop/hbase/client/IsolationLevel;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setLoadColumnFamiliesOnDemand ( boolean value ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setLoadColumnFamiliesOnDemand:(Z)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setMaxResultSize ( long maxResultSize ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setMaxResultSize:(J)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setMaxResultsPerColumnFamily ( int limit ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setMaxResultsPerColumnFamily:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setRaw ( boolean raw ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setRaw:(Z)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setReplicaId ( int Id ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setReplicaId:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setReplicaId ( int x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setReplicaId:(I)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setRowOffsetPerColumnFamily ( int offset ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setRowOffsetPerColumnFamily:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setRowPrefixFilter ( byte[ ] rowPrefix ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setRowPrefixFilter:([B)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setSmall ( boolean small ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setSmall:(Z)Lorg/apache/hadoop/hbase/client/Scan;]
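Several of these Scan additions surface new 1.0 features: timeline-consistent reads from region replicas (setConsistency, setReplicaId), prefix scans (setRowPrefixFilter), and the small-scan optimization (setSmall). A hedged sketch using only methods listed above; the prefix and caching values are illustrative.

```java
import org.apache.hadoop.hbase.client.Consistency;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanExample {
    public static void main(String[] args) {
        Scan scan = new Scan();
        scan.setRowPrefixFilter(Bytes.toBytes("user-123|")) // derives start/stop rows from the prefix
            .setCaching(100)
            .setConsistency(Consistency.TIMELINE)           // allow possibly-stale reads from replicas
            .setSmall(true);                                 // hint for short, single-block scans
        // With Consistency.TIMELINE, Result.isStale() (also added in 1.0) reports
        // whether a result was served by a secondary replica.
    }
}
```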
hbase-client-1.0.0.jar,
Table.class
package org.apache.hadoop.hbase.client
Table.append ( Append p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.append:(Lorg/apache/hadoop/hbase/client/Append;)Lorg/apache/hadoop/hbase/client/Result;]
Table.batch ( java.util.List<? extends Row> p1 ) [abstract] : Object[ ] *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Table.batch:(Ljava/util/List;)[Ljava/lang/Object;]
Table.batch ( java.util.List<? extends Row> p1, Object[ ] p2 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batch:(Ljava/util/List;[Ljava/lang/Object;)V]
Table.batchCallback ( java.util.List<? extends Row> p1, Object[ ] p2, coprocessor.Batch.Callback<R> p3 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batchCallback:(Ljava/util/List;[Ljava/lang/Object;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.batchCallback ( java.util.List<? extends Row> p1, coprocessor.Batch.Callback<R> p2 ) [abstract] : Object[ ] *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Table.batchCallback:(Ljava/util/List;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)[Ljava/lang/Object;]
Table.batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor p1, com.google.protobuf.Message p2, byte[ ] p3, byte[ ] p4, R p5 ) [abstract] : java.util.Map<byte[ ],R>
[mangled: org/apache/hadoop/hbase/client/Table.batchCoprocessorService:(Lcom/google/protobuf/Descriptors$MethodDescriptor;Lcom/google/protobuf/Message;[B[BLcom/google/protobuf/Message;)Ljava/util/Map;]
Table.batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor p1, com.google.protobuf.Message p2, byte[ ] p3, byte[ ] p4, R p5, coprocessor.Batch.Callback<R> p6 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batchCoprocessorService:(Lcom/google/protobuf/Descriptors$MethodDescriptor;Lcom/google/protobuf/Message;[B[BLcom/google/protobuf/Message;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.checkAndDelete ( byte[ ] p1, byte[ ] p2, byte[ ] p3, byte[ ] p4, Delete p5 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndDelete:([B[B[B[BLorg/apache/hadoop/hbase/client/Delete;)Z]
Table.checkAndDelete ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, Delete p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndDelete:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/Delete;)Z]
Table.checkAndMutate ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, RowMutations p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndMutate:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/RowMutations;)Z]
Table.checkAndPut ( byte[ ] p1, byte[ ] p2, byte[ ] p3, byte[ ] p4, Put p5 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndPut:([B[B[B[BLorg/apache/hadoop/hbase/client/Put;)Z]
Table.checkAndPut ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, Put p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndPut:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/Put;)Z]
Table.close ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.close:()V]
Table.coprocessorService ( byte[ ] p1 ) [abstract] : org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:([B)Lorg/apache/hadoop/hbase/ipc/CoprocessorRpcChannel;]
Table.coprocessorService ( Class<T> p1, byte[ ] p2, byte[ ] p3, coprocessor.Batch.Call<T,R> p4 ) [abstract] : java.util.Map<byte[ ],R>
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:(Ljava/lang/Class;[B[BLorg/apache/hadoop/hbase/client/coprocessor/Batch$Call;)Ljava/util/Map;]
Table.coprocessorService ( Class<T> p1, byte[ ] p2, byte[ ] p3, coprocessor.Batch.Call<T,R> p4, coprocessor.Batch.Callback<R> p5 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:(Ljava/lang/Class;[B[BLorg/apache/hadoop/hbase/client/coprocessor/Batch$Call;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.delete ( java.util.List<Delete> p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.delete:(Ljava/util/List;)V]
Table.delete ( Delete p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.delete:(Lorg/apache/hadoop/hbase/client/Delete;)V]
Table.exists ( Get p1 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.exists:(Lorg/apache/hadoop/hbase/client/Get;)Z]
Table.existsAll ( java.util.List<Get> p1 ) [abstract] : boolean[ ]
[mangled: org/apache/hadoop/hbase/client/Table.existsAll:(Ljava/util/List;)[Z]
Table.flushCommits ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.flushCommits:()V]
Table.get ( java.util.List<Get> p1 ) [abstract] : Result[ ]
[mangled: org/apache/hadoop/hbase/client/Table.get:(Ljava/util/List;)[Lorg/apache/hadoop/hbase/client/Result;]
Table.get ( Get p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.get:(Lorg/apache/hadoop/hbase/client/Get;)Lorg/apache/hadoop/hbase/client/Result;]
Table.getConfiguration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/hadoop/hbase/client/Table.getConfiguration:()Lorg/apache/hadoop/conf/Configuration;]
Table.getName ( ) [abstract] : org.apache.hadoop.hbase.TableName
[mangled: org/apache/hadoop/hbase/client/Table.getName:()Lorg/apache/hadoop/hbase/TableName;]
Table.getScanner ( byte[ ] p1 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:([B)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getScanner ( byte[ ] p1, byte[ ] p2 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:([B[B)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getScanner ( Scan p1 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:(Lorg/apache/hadoop/hbase/client/Scan;)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getTableDescriptor ( ) [abstract] : org.apache.hadoop.hbase.HTableDescriptor
[mangled: org/apache/hadoop/hbase/client/Table.getTableDescriptor:()Lorg/apache/hadoop/hbase/HTableDescriptor;]
Table.getWriteBufferSize ( ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.getWriteBufferSize:()J]
Table.increment ( Increment p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.increment:(Lorg/apache/hadoop/hbase/client/Increment;)Lorg/apache/hadoop/hbase/client/Result;]
Table.incrementColumnValue ( byte[ ] p1, byte[ ] p2, byte[ ] p3, long p4 ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.incrementColumnValue:([B[B[BJ)J]
Table.incrementColumnValue ( byte[ ] p1, byte[ ] p2, byte[ ] p3, long p4, Durability p5 ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.incrementColumnValue:([B[B[BJLorg/apache/hadoop/hbase/client/Durability;)J]
Table.isAutoFlush ( ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.isAutoFlush:()Z]
Table.mutateRow ( RowMutations p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.mutateRow:(Lorg/apache/hadoop/hbase/client/RowMutations;)V]
Table.put ( java.util.List<Put> p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.put:(Ljava/util/List;)V]
Table.put ( Put p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.put:(Lorg/apache/hadoop/hbase/client/Put;)V]
Table.setAutoFlushTo ( boolean p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.setAutoFlushTo:(Z)V]
Table.setWriteBufferSize ( long p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.setWriteBufferSize:(J)V]
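Table is the narrower replacement for HTableInterface; instances come from Connection.getTable and are not meant to be shared across threads. The sketch below shows typical use, including the new checkAndPut overload that takes a CompareOp; it assumes ConnectionFactory and HBaseConfiguration from the 1.0 client, and the table, row, and column values are placeholders.

```java
import java.io.IOException;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.util.Bytes;

public class TableExample {
    public static void main(String[] args) throws IOException {
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = connection.getTable(TableName.valueOf("orders"))) {
            Put put = new Put(Bytes.toBytes("order-1"));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("state"), Bytes.toBytes("SHIPPED"));
            // Atomically apply the Put only if cf:state currently equals "PENDING".
            boolean applied = table.checkAndPut(
                Bytes.toBytes("order-1"), Bytes.toBytes("cf"), Bytes.toBytes("state"),
                CompareOp.EQUAL, Bytes.toBytes("PENDING"), put);
            System.out.println("applied = " + applied);
        }
    }
}
```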
hbase-client-1.0.0.jar,
TokenUtil.class
package org.apache.hadoop.hbase.security.token
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.addTokenIfMissing ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : boolean
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenIfMissing:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)Z]
TokenUtil.obtainAndCacheToken ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainAndCacheToken:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.Connection conn ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/Connection;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
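These TokenUtil overloads mirror the HConnection-based ones listed under Removed Methods below, but accept the new Connection type. A hedged migration sketch; ConnectionFactory, HBaseConfiguration, Job.getInstance, and User.getCurrent are assumed from the surrounding HBase/Hadoop APIs rather than this report, and the job name is a placeholder.

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.User;
import org.apache.hadoop.hbase.security.token.TokenUtil;
import org.apache.hadoop.mapreduce.Job;

public class TokenExample {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(HBaseConfiguration.create(), "hbase-mr-job");
        // 0.98 passed an HConnection here; branch-1.0 takes a Connection instead.
        try (Connection connection = ConnectionFactory.createConnection(job.getConfiguration())) {
            TokenUtil.obtainTokenForJob(connection, User.getCurrent(), job);
        }
    }
}
```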
hbase-common-1.0.0.jar,
ByteBufferUtils.class
package org.apache.hadoop.hbase.util
ByteBufferUtils.compareTo ( java.nio.ByteBuffer buf1, int o1, int len1, java.nio.ByteBuffer buf2, int o2, int len2 ) [static] : int
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.compareTo:(Ljava/nio/ByteBuffer;IILjava/nio/ByteBuffer;II)I]
ByteBufferUtils.copyFromBufferToBuffer ( java.nio.ByteBuffer out, java.nio.ByteBuffer in, int sourceOffset, int destinationOffset, int length ) [static] : void
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.copyFromBufferToBuffer:(Ljava/nio/ByteBuffer;Ljava/nio/ByteBuffer;III)V]
ByteBufferUtils.toBytes ( java.nio.ByteBuffer buffer, int offset, int length ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.toBytes:(Ljava/nio/ByteBuffer;II)[B]
hbase-common-1.0.0.jar,
CellUtil.class
package org.apache.hadoop.hbase
CellUtil.createCell ( byte[ ] row ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] row, byte[ ] family, byte[ ] qualifier ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B[B[B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] row, byte[ ] value ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B[B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] rowArray, int rowOffset, int rowLength, byte[ ] familyArray, int familyOffset, int familyLength, byte[ ] qualifierArray, int qualifierOffset, int qualifierLength ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([BII[BII[BII)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.estimatedHeapSizeOf ( Cell cell ) [static] : long
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedHeapSizeOf:(Lorg/apache/hadoop/hbase/Cell;)J]
CellUtil.estimatedSerializedSizeOf ( Cell cell ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedSerializedSizeOf:(Lorg/apache/hadoop/hbase/Cell;)I]
CellUtil.estimatedSerializedSizeOfKey ( Cell cell ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedSerializedSizeOfKey:(Lorg/apache/hadoop/hbase/Cell;)I]
CellUtil.findCommonPrefixInFlatKey ( Cell c1, Cell c2, boolean bypassFamilyCheck, boolean withTsType ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.findCommonPrefixInFlatKey:(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;ZZ)I]
CellUtil.getCellKeyAsString ( Cell cell ) [static] : String
[mangled: org/apache/hadoop/hbase/CellUtil.getCellKeyAsString:(Lorg/apache/hadoop/hbase/Cell;)Ljava/lang/String;]
CellUtil.getCellKeySerializedAsKeyValueKey ( Cell cell ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/CellUtil.getCellKeySerializedAsKeyValueKey:(Lorg/apache/hadoop/hbase/Cell;)[B]
CellUtil.isDelete ( byte type ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDelete:(B)Z]
CellUtil.isDeleteColumnOrFamily ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteColumnOrFamily:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.isDeleteFamilyVersion ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteFamilyVersion:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.isDeleteType ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteType:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.matchingColumn ( Cell left, byte[ ] fam, int foffset, int flength, byte[ ] qual, int qoffset, int qlength ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingColumn:(Lorg/apache/hadoop/hbase/Cell;[BII[BII)Z]
CellUtil.matchingFamily ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingFamily:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.matchingQualifier ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingQualifier:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.matchingRow ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingRow:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.setSequenceId ( Cell cell, long seqId ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setSequenceId:(Lorg/apache/hadoop/hbase/Cell;J)V]
CellUtil.setTimestamp ( Cell cell, byte[ ] ts, int tsOffset ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setTimestamp:(Lorg/apache/hadoop/hbase/Cell;[BI)V]
CellUtil.setTimestamp ( Cell cell, long ts ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setTimestamp:(Lorg/apache/hadoop/hbase/Cell;J)V]
CellUtil.updateLatestStamp ( Cell cell, byte[ ] ts, int tsOffset ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.updateLatestStamp:(Lorg/apache/hadoop/hbase/Cell;[BI)Z]
CellUtil.updateLatestStamp ( Cell cell, long ts ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.updateLatestStamp:(Lorg/apache/hadoop/hbase/Cell;J)Z]
CellUtil.writeFlatKey ( Cell cell, java.io.DataOutputStream out ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.writeFlatKey:(Lorg/apache/hadoop/hbase/Cell;Ljava/io/DataOutputStream;)V]
CellUtil.writeRowKeyExcludingCommon ( Cell cell, short rLen, int commonPrefix, java.io.DataOutputStream out ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.writeRowKeyExcludingCommon:(Lorg/apache/hadoop/hbase/Cell;SILjava/io/DataOutputStream;)V]
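Many of the CellUtil additions are helpers for working with Cell directly instead of through KeyValue. A small, hypothetical sketch using a few of the methods listed above; the row, family, and qualifier values are made up.

```java
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.util.Bytes;

public class CellUtilExample {
    public static void main(String[] args) {
        byte[] family = Bytes.toBytes("cf");
        Cell cell = CellUtil.createCell(Bytes.toBytes("row-1"), family, Bytes.toBytes("q"));
        // Compare the cell's family against a slice of a byte array without copying.
        boolean sameFamily = CellUtil.matchingFamily(cell, family, 0, family.length);
        System.out.println(CellUtil.getCellKeyAsString(cell) + " matches cf: " + sameFamily);
    }
}
```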
hbase-common-1.0.0.jar,
Counter.class
package org.apache.hadoop.hbase.util
Counter.add ( long delta ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.add:(J)V]
Counter.Counter ( )
[mangled: org/apache/hadoop/hbase/util/Counter."<init>":()V]
Counter.Counter ( long initValue )
[mangled: org/apache/hadoop/hbase/util/Counter."<init>":(J)V]
Counter.decrement ( ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.decrement:()V]
Counter.get ( ) : long
[mangled: org/apache/hadoop/hbase/util/Counter.get:()J]
Counter.increment ( ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.increment:()V]
Counter.set ( long value ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.set:(J)V]
Counter.toString ( ) : String
[mangled: org/apache/hadoop/hbase/util/Counter.toString:()Ljava/lang/String;]
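Counter is a small utility for high-contention counting. A minimal sketch of the methods listed above:

```java
import org.apache.hadoop.hbase.util.Counter;

public class CounterExample {
    public static void main(String[] args) {
        Counter requests = new Counter();   // starts at 0; Counter(long) sets an initial value
        requests.increment();
        requests.add(41L);
        requests.decrement();
        System.out.println(requests.get()); // prints 41
        requests.set(0L);                   // reset
    }
}
```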
hbase-server-1.0.0.jar,
HFileOutputFormat2.class
package org.apache.hadoop.hbase.mapreduce
HFileOutputFormat2.configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.Table table, org.apache.hadoop.hbase.client.RegionLocator regionLocator ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoad:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/Table;Lorg/apache/hadoop/hbase/client/RegionLocator;)V]
HFileOutputFormat2.configureIncrementalLoadMap ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.Table table ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoadMap:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/Table;)V]
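configureIncrementalLoad now accepts the new Table and RegionLocator interfaces rather than an HTable (the HTable-based configureIncrementalLoadMap appears under Removed Methods below). A hedged migration sketch; ConnectionFactory, HBaseConfiguration, and Job.getInstance are assumed from the surrounding APIs, and the job and table names are placeholders.

```java
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;

public class BulkLoadSetupExample {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(HBaseConfiguration.create(), "bulk-load-prepare");
        TableName name = TableName.valueOf("bulk_target");
        try (Connection connection = ConnectionFactory.createConnection(job.getConfiguration());
             Table table = connection.getTable(name);
             RegionLocator locator = connection.getRegionLocator(name)) {
            // Configures reducer, partitioner, and output format from the table's current regions.
            HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
        }
    }
}
```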
hbase-server-1.0.0.jar,
RowTooBigException.class
package org.apache.hadoop.hbase.regionserver
RowTooBigException.RowTooBigException ( String message )
[mangled: org/apache/hadoop/hbase/regionserver/RowTooBigException."<init>":(Ljava/lang/String;)V]
hbase-server-1.0.0.jar,
TableInputFormat.class
package org.apache.hadoop.hbase.mapreduce
TableInputFormat.getSplits ( org.apache.hadoop.mapreduce.JobContext context ) : java.util.List<org.apache.hadoop.mapreduce.InputSplit>
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormat.getSplits:(Lorg/apache/hadoop/mapreduce/JobContext;)Ljava/util/List;]
TableInputFormat.initialize ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormat.initialize:()V]
hbase-server-1.0.0.jar,
TableInputFormatBase.class
package org.apache.hadoop.hbase.mapreduce
TableInputFormatBase.closeTable ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.closeTable:()V]
TableInputFormatBase.getAdmin ( ) : org.apache.hadoop.hbase.client.Admin
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
TableInputFormatBase.getRegionLocator ( ) : org.apache.hadoop.hbase.client.RegionLocator
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getRegionLocator:()Lorg/apache/hadoop/hbase/client/RegionLocator;]
TableInputFormatBase.getTable ( ) : org.apache.hadoop.hbase.client.Table
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getTable:()Lorg/apache/hadoop/hbase/client/Table;]
TableInputFormatBase.initialize ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.initialize:()V]
TableInputFormatBase.initializeTable ( org.apache.hadoop.hbase.client.Connection connection, org.apache.hadoop.hbase.TableName tableName ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.initializeTable:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/TableName;)V]
hbase-server-1.0.0.jar,
TableRecordReader.class
package org.apache.hadoop.hbase.mapreduce
TableRecordReader.setHTable ( org.apache.hadoop.hbase.client.Table htable ) : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/mapreduce/TableRecordReader.setHTable:(Lorg/apache/hadoop/hbase/client/Table;)V]
TableRecordReader.setTable ( org.apache.hadoop.hbase.client.Table table ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableRecordReader.setTable:(Lorg/apache/hadoop/hbase/client/Table;)V]
Removed Methods (35)
hbase-client-0.98.9.jar,
Attributes.class
package org.apache.hadoop.hbase.client
Attributes.setAttribute ( String p1, byte[ ] p2 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Attributes.setAttribute:(Ljava/lang/String;[B)V]
hbase-client-0.98.9.jar,
ClientScanner.class
package org.apache.hadoop.hbase.client
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, byte[ ] tableName ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;[B)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, byte[ ] tableName, HConnection connection ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;[BLorg/apache/hadoop/hbase/client/HConnection;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory controllerFactory )
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;)V]
ClientScanner.getConnection ( ) : HConnection
[mangled: org/apache/hadoop/hbase/client/ClientScanner.getConnection:()Lorg/apache/hadoop/hbase/client/HConnection;]
ClientScanner.getScannerCallable ( byte[ ] localStartKey, int nbRows ) : ScannerCallable
[mangled: org/apache/hadoop/hbase/client/ClientScanner.getScannerCallable:([BI)Lorg/apache/hadoop/hbase/client/ScannerCallable;]
hbase-client-0.98.9.jar,
ClientSmallScanner.class
package org.apache.hadoop.hbase.client
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;)V]
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory controllerFactory )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;)V]
hbase-client-0.98.9.jar,
Filter.class
package org.apache.hadoop.hbase.filter
Filter.filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> p1 ) [abstract] : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/filter/Filter.filterRow:(Ljava/util/List;)V]
hbase-client-0.98.9.jar,
FilterList.class
package org.apache.hadoop.hbase.filter
FilterList.filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> kvs ) : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/filter/FilterList.filterRow:(Ljava/util/List;)V]
hbase-client-0.98.9.jar,
Get.class
package org.apache.hadoop.hbase.client
Get.setCacheBlocks ( boolean cacheBlocks ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setCacheBlocks:(Z)V]
Get.setCheckExistenceOnly ( boolean checkExistenceOnly ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setCheckExistenceOnly:(Z)V]
Get.setClosestRowBefore ( boolean closestRowBefore ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setClosestRowBefore:(Z)V]
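These Get setters did not disappear from the source API; in branch-1.0 they return Get rather than void (not shown in this excerpt). Because the return type is part of the JVM method descriptor, that is still a binary break: code compiled against 0.98 throws NoSuchMethodError until it is recompiled. A hedged sketch assuming the Get-returning 1.0 signatures:

```java
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.util.Bytes;

public class GetSetterExample {
    public static void main(String[] args) {
        Get get = new Get(Bytes.toBytes("row-1"));
        // 0.98 descriptor: setCacheBlocks:(Z)V
        // 1.0  descriptor: setCacheBlocks:(Z)Lorg/apache/hadoop/hbase/client/Get;
        // Old call sites need a recompile even though the source is unchanged.
        get.setCacheBlocks(false)
           .setCheckExistenceOnly(true);
    }
}
```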
hbase-client-0.98.9.jar,
HTable.class
package org.apache.hadoop.hbase.client
HTable.HTable ( byte[ ] tableName, HConnection connection, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":([BLorg/apache/hadoop/hbase/client/HConnection;Ljava/util/concurrent/ExecutorService;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Ljava/util/concurrent/ExecutorService;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection, TableConfiguration tableConfig, RpcRetryingCallerFactory rpcCallerFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory rpcControllerFactory, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/TableConfiguration;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;Ljava/util/concurrent/ExecutorService;)V]
HTable.main ( String[ ] args ) [static] : void
[mangled: org/apache/hadoop/hbase/client/HTable.main:([Ljava/lang/String;)V]
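The removed HTable constructors all take an HConnection; in branch-1.0 callers are expected to obtain Table instances from a Connection instead of constructing HTable directly. A hedged before/after sketch; ConnectionFactory and HBaseConfiguration are assumed from the 1.0 client, and the table name is a placeholder.

```java
import java.io.IOException;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

public class HTableMigrationExample {
    public static void main(String[] args) throws IOException {
        // 0.98 (constructors now removed):
        //   HTable table = new HTable(TableName.valueOf("t1"), hConnection, pool);
        // branch-1.0 replacement:
        try (Connection connection = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = connection.getTable(TableName.valueOf("t1"))) {
            // ... reads and writes go through the Table interface ...
        }
    }
}
```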
hbase-client-0.98.9.jar,
ReversedClientScanner.class
package org.apache.hadoop.hbase.client
ReversedClientScanner.getScannerCallable ( byte[ ] localStartKey, int nbRows, byte[ ] locateStartRow ) : ScannerCallable
[mangled: org/apache/hadoop/hbase/client/ReversedClientScanner.getScannerCallable:([BI[B)Lorg/apache/hadoop/hbase/client/ScannerCallable;]
ReversedClientScanner.ReversedClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ReversedClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
hbase-client-0.98.9.jar,
ReversedScannerCallable.class
package org.apache.hadoop.hbase.client
ReversedScannerCallable.ReversedScannerCallable ( HConnection connection, org.apache.hadoop.hbase.TableName tableName, Scan scan, metrics.ScanMetrics scanMetrics, byte[ ] locateStartRow ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ReversedScannerCallable."<init>":(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/client/metrics/ScanMetrics;[B)V]
ReversedScannerCallable.ReversedScannerCallable ( HConnection connection, org.apache.hadoop.hbase.TableName tableName, Scan scan, metrics.ScanMetrics scanMetrics, byte[ ] locateStartRow, org.apache.hadoop.hbase.ipc.PayloadCarryingRpcController rpcFactory )
[mangled: org/apache/hadoop/hbase/client/ReversedScannerCallable."<init>":(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/client/metrics/ScanMetrics;[BLorg/apache/hadoop/hbase/ipc/PayloadCarryingRpcController;)V]
hbase-client-0.98.9.jar,
TokenUtil.class
package org.apache.hadoop.hbase.security.token
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.addTokenIfMissing ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : boolean
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenIfMissing:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)Z]
TokenUtil.obtainAndCacheToken ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainAndCacheToken:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.HConnection conn ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/HConnection;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
hbase-server-0.98.9.jar,
HFileOutputFormat2.class
package org.apache.hadoop.hbase.mapreduce
HFileOutputFormat2.configureIncrementalLoadMap ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.HTable table ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoadMap:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/HTable;)V]
Problems with Data Types, High Severity (21)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase.client
Attributes (1)

| | Change | Effect |
|---|---|---|
| 1 | Abstract method setAttribute ( java.lang.String, byte[ ] ) has been removed from this interface. | A client program may be interrupted by NoSuchMethodError exception. |
Affected methods (2)

getAttribute ( java.lang.String ) - This abstract method is from 'Attributes' interface.
getAttributesMap ( ) - This abstract method is from 'Attributes' interface.
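The setAttribute signature did not vanish from the source API; its return type changed from void to Attributes (see the added abstract method earlier in this report). To the JVM that is a different method, so clients built against 0.98 fail at runtime. A hedged illustration; Put is used only because it implements Attributes, and the attribute name and value are placeholders.

```java
import org.apache.hadoop.hbase.client.Attributes;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class AttributesBreakExample {
    public static void main(String[] args) {
        Attributes op = new Put(Bytes.toBytes("row-1"));  // Put implements Attributes
        // Compiled against 0.98, this call is encoded as
        //   setAttribute:(Ljava/lang/String;[B)V
        // Against 1.0, the interface only declares
        //   setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;
        // so the pre-built 0.98 binary throws NoSuchMethodError here.
        op.setAttribute("origin", Bytes.toBytes("batch-import"));
    }
}
```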
ClientScanner (1)

| | Change | Effect |
|---|---|---|
| 1 | Type of field callable has been changed from ScannerCallable to ScannerCallableWithReplicas. | A client program may be interrupted by NoSuchFieldError exception. |
Affected methods (10)

checkScanStopRow ( byte[ ] ) - This method is from 'ClientScanner' class.
close ( ) - This method is from 'ClientScanner' class.
getScan ( ) - This method is from 'ClientScanner' class.
getTable ( ) - This method is from 'ClientScanner' class.
getTableName ( ) - This method is from 'ClientScanner' class.
getTimestamp ( ) - This method is from 'ClientScanner' class.
initializeScannerInConstruction ( ) - This method is from 'ClientScanner' class.
next ( ) - This method is from 'ClientScanner' class.
nextScanner ( int, boolean ) - This method is from 'ClientScanner' class.
writeScanMetrics ( ) - This method is from 'ClientScanner' class.
HConnection (2)

| | Change | Effect |
|---|---|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by NoSuchMethodError exception. |
| 2 | Removed super-interface org.apache.hadoop.hbase.Abortable. | A client program may be interrupted by NoSuchMethodError exception. |
Affected methods (59)

clearCaches ( org.apache.hadoop.hbase.ServerName ) - This abstract method is from 'HConnection' interface.
clearRegionCache ( ) - This abstract method is from 'HConnection' interface.
clearRegionCache ( byte[ ] ) - This abstract method is from 'HConnection' interface.
clearRegionCache ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
deleteCachedRegionLocation ( org.apache.hadoop.hbase.HRegionLocation ) - This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName ) - This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName, boolean ) - This abstract method is from 'HConnection' interface.
getClient ( org.apache.hadoop.hbase.ServerName ) - This abstract method is from 'HConnection' interface.
getConfiguration ( ) - This abstract method is from 'HConnection' interface.
getCurrentNrHRS ( ) - This abstract method is from 'HConnection' interface.
getHTableDescriptor ( byte[ ] ) - This abstract method is from 'HConnection' interface.
getHTableDescriptor ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
getHTableDescriptors ( java.util.List<java.lang.String> ) - This abstract method is from 'HConnection' interface.
getHTableDescriptorsByTableName ( java.util.List<org.apache.hadoop.hbase.TableName> ) - This abstract method is from 'HConnection' interface.
getKeepAliveMasterService ( ) - This abstract method is from 'HConnection' interface.
getMaster ( ) - This abstract method is from 'HConnection' interface.
getNonceGenerator ( ) - This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( byte[ ] ) - This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
getRegionLocation ( byte[ ], byte[ ], boolean ) - This abstract method is from 'HConnection' interface.
getRegionLocation ( org.apache.hadoop.hbase.TableName, byte[ ], boolean ) - This abstract method is from 'HConnection' interface.
getTable ( byte[ ] ) - This abstract method is from 'HConnection' interface.
getTable ( byte[ ], java.util.concurrent.ExecutorService ) - This abstract method is from 'HConnection' interface.
getTable ( java.lang.String ) - This abstract method is from 'HConnection' interface.
getTable ( java.lang.String, java.util.concurrent.ExecutorService ) - This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService ) - This abstract method is from 'HConnection' interface.
getTableNames ( ) - This abstract method is from 'HConnection' interface.
isClosed ( ) - This abstract method is from 'HConnection' interface.
isDeadServer ( org.apache.hadoop.hbase.ServerName ) - This abstract method is from 'HConnection' interface.
isMasterRunning ( ) - This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ] ) - This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ], byte[ ][ ] ) - This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName, byte[ ][ ] ) - This abstract method is from 'HConnection' interface.
isTableDisabled ( byte[ ] ) - This abstract method is from 'HConnection' interface.
isTableDisabled ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
isTableEnabled ( byte[ ] ) - This abstract method is from 'HConnection' interface.
isTableEnabled ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
listTableNames ( ) - This abstract method is from 'HConnection' interface.
listTables ( ) - This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ] ) - This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ], byte[ ] ) - This abstract method is from 'HConnection' interface.
locateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] ) - This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ] ) - This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ], boolean, boolean ) - This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName ) - This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName, boolean, boolean ) - This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ] ) - This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ] ) - This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
relocateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
relocateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( byte[ ], boolean )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This abstract method is from 'HConnection' interface.
updateCachedLocations ( byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getConnection ( )Return value of this method has type 'HConnection'.
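One way the removed super-interfaces can bite is client code that passes an HConnection where a Closeable (or Abortable) is expected. A hedged sketch with invented class and method names:

```java
import java.io.Closeable;
import java.io.IOException;
import org.apache.hadoop.hbase.client.HConnection;

// Hypothetical cleanup utility compiled against 0.98.9, where HConnection
// extends java.io.Closeable (and org.apache.hadoop.hbase.Abortable), so an
// HConnection may be passed wherever a Closeable is expected.
public class ConnectionCleanup {
    public static void closeQuietly(Closeable resource) {
        try {
            if (resource != null) {
                resource.close();
            }
        } catch (IOException ignored) {
            // best-effort cleanup
        }
    }

    // Legal only while HConnection is-a Closeable; this is the kind of usage
    // the removed-super-interface warning above flags as at risk under 1.0.
    public static void release(HConnection connection) {
        closeQuietly(connection);
    }
}
```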
[+] HTable (2)
| Change | Effect |
|---|
| 1 | Type of field ap has been changed from AsyncProcess<java.lang.Object> to AsyncProcess. | A client program may be interrupted by NoSuchFieldError exception. |
| 2 | Type of field connection has been changed from HConnection to ClusterConnection. | A client program may be interrupted by NoSuchFieldError exception. |
[+] affected methods (86)
append ( Append )This method is from 'HTable' class.
batch ( java.util.List<? extends Row> )This method is from 'HTable' class.
batch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
checkAndDelete ( byte[ ], byte[ ], byte[ ], byte[ ], Delete )This method is from 'HTable' class.
checkAndMutate ( byte[ ], byte[ ], byte[ ], org.apache.hadoop.hbase.filter.CompareFilter.CompareOp, byte[ ], RowMutations )This method is from 'HTable' class.
checkAndPut ( byte[ ], byte[ ], byte[ ], byte[ ], Put )This method is from 'HTable' class.
clearRegionCache ( )This method is from 'HTable' class.
close ( )This method is from 'HTable' class.
coprocessorService ( byte[ ] )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R> )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
delete ( java.util.List<Delete> )This method is from 'HTable' class.
delete ( Delete )This method is from 'HTable' class.
exists ( java.util.List<Get> )This method is from 'HTable' class.
exists ( Get )This method is from 'HTable' class.
flushCommits ( )This method is from 'HTable' class.
get ( java.util.List<Get> )This method is from 'HTable' class.
get ( Get )This method is from 'HTable' class.
getConfiguration ( )This method is from 'HTable' class.
getConnection ( )This method is from 'HTable' class.
getDefaultExecutor ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getEndKeys ( )This method is from 'HTable' class.
getMaxKeyValueSize ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getName ( )This method is from 'HTable' class.
getOperationTimeout ( )This method is from 'HTable' class.
getRegionCachePrefetch ( byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionLocation ( byte[ ] )This method is from 'HTable' class.
getRegionLocation ( byte[ ], boolean )This method is from 'HTable' class.
getRegionLocation ( java.lang.String )This method is from 'HTable' class.
getRegionLocations ( )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ] )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ], boolean )This method is from 'HTable' class.
getRowOrBefore ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( Scan )This method is from 'HTable' class.
getScannerCaching ( )This method is from 'HTable' class.
getStartEndKeys ( )This method is from 'HTable' class.
getStartKeys ( )This method is from 'HTable' class.
getTableDescriptor ( )This method is from 'HTable' class.
getTableName ( )This method is from 'HTable' class.
getWriteBuffer ( )This method is from 'HTable' class.
getWriteBufferSize ( )This method is from 'HTable' class.
HTable ( )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ] )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ], java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, java.lang.String )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
increment ( Increment )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, boolean )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, Durability )This method is from 'HTable' class.
isAutoFlush ( )This method is from 'HTable' class.
isTableEnabled ( byte[ ] )This method is from 'HTable' class.
isTableEnabled ( java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
mutateRow ( RowMutations )This method is from 'HTable' class.
processBatch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
processBatchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
put ( java.util.List<Put> )This method is from 'HTable' class.
put ( Put )This method is from 'HTable' class.
setAutoFlush ( boolean )This method is from 'HTable' class.
setAutoFlush ( boolean, boolean )This method is from 'HTable' class.
setAutoFlushTo ( boolean )This method is from 'HTable' class.
setOperationTimeout ( int )This method is from 'HTable' class.
setRegionCachePrefetch ( byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setScannerCaching ( int )This method is from 'HTable' class.
setWriteBufferSize ( long )This method is from 'HTable' class.
toString ( )This method is from 'HTable' class.
validatePut ( Put )This method is from 'HTable' class.
validatePut ( Put, int )This method is from 'HTable' class.
configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job, HTable )2nd parameter 'table' of this method has type 'HTable'.
[+] HTableInterface (1)
| Change | Effect |
|---|
| 1 | Removed super-interface java.io.Closeable. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (10)
getTable ( byte[ ] )Return value of this abstract method has type 'HTableInterface'.
getTable ( byte[ ], java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'table' of this method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this abstract method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'p1' of this abstract method has type 'HTableInterface'.
package org.apache.hadoop.hbase.filter
[+] Filter (1)
| Change | Effect |
|---|
| 1 | Abstract method filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> ) has been removed from this class. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (30)
setFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
setFilter ( Filter )Field 'retval.filter' in return value of this method has type 'Filter'.
Filter ( )This constructor is from 'Filter' abstract class.
filterAllRemaining ( )This abstract method is from 'Filter' abstract class.
filterKeyValue ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
filterRow ( )This abstract method is from 'Filter' abstract class.
filterRowCells ( java.util.List<org.apache.hadoop.hbase.Cell> )This abstract method is from 'Filter' abstract class.
filterRowKey ( byte[ ], int, int )This abstract method is from 'Filter' abstract class.
getNextCellHint ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
getNextKeyHint ( org.apache.hadoop.hbase.KeyValue )This abstract method is from 'Filter' abstract class.
hasFilterRow ( )This abstract method is from 'Filter' abstract class.
isFamilyEssential ( byte[ ] )This abstract method is from 'Filter' abstract class.
isReversed ( )This method is from 'Filter' abstract class.
parseFrom ( byte[ ] )This method is from 'Filter' abstract class.
reset ( )This abstract method is from 'Filter' abstract class.
setReversed ( boolean )This method is from 'Filter' abstract class.
toByteArray ( )This abstract method is from 'Filter' abstract class.
transform ( org.apache.hadoop.hbase.KeyValue )This abstract method is from 'Filter' abstract class.
transformCell ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
addFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
parseFilterString ( byte[ ] )Return value of this method has type 'Filter'.
parseFilterString ( java.lang.String )Return value of this method has type 'Filter'.
parseSimpleFilterExpression ( byte[ ] )Return value of this method has type 'Filter'.
popArguments ( java.util.Stack<java.nio.ByteBuffer>, java.util.Stack<Filter> )Return value of this method has type 'Filter'.
createFilterFromArguments ( java.util.ArrayList<byte[ ]> )Return value of this method has type 'Filter'.
createFilterFromArguments ( java.util.ArrayList<byte[ ]> )Return value of this method has type 'Filter'.
getFilter ( )Return value of this method has type 'Filter'.
SkipFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
filterKv ( Filter, org.apache.hadoop.hbase.Cell )1st parameter 'filter' of this method has type 'Filter'.
instantiateFilter ( org.apache.hadoop.conf.Configuration )Return value of this method has type 'Filter'.
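A short sketch of how the removed filterRow(List&lt;KeyValue&gt;) overload surfaces at run time; the wrapper class is hypothetical, and clients are expected to move to the Cell-based filterRowCells instead:

```java
import java.io.IOException;
import java.util.List;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.filter.Filter;

// Hypothetical wrapper compiled against 0.98.9, which still declares
// Filter.filterRow(List<KeyValue>). The call is emitted with that exact
// signature; it no longer exists on the branch-1.0 Filter class, so invoking
// it against the new jars throws NoSuchMethodError.
public class RowFiltering {
    public static void applyRowFilter(Filter filter, List<KeyValue> row) throws IOException {
        filter.filterRow(row); // overload removed in branch-1.0
    }
}
```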
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.master
[+] HMaster (2)
| Change | Effect |
|---|
| 1 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.MasterProtos.MasterService.BlockingInterface. | A client program may be interrupted by NoSuchMethodError exception. |
| 2 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos.RegionServerStatusService.BlockingInterface. | A client program may be interrupted by NoSuchMethodError exception. |
[+] affected methods (2)
getActiveMaster ( )Return value of this method has type 'HMaster'.
getMaster ( int )Return value of this method has type 'HMaster'.
package org.apache.hadoop.hbase.regionserver
[+] HRegionServer (11)
| Change | Effect |
|---|
| 1 | Access level of field abortRequested has been changed from protected to private. | A client program may be interrupted by IllegalAccessError exception. |
| 2 | Access level of field stopped has been changed from protected to private. | A client program may be interrupted by IllegalAccessError exception. |
| 3 | Removed super-interface java.lang.Runnable. | A client program may be interrupted by NoSuchMethodError exception. |
| 4 | Removed super-interface org.apache.hadoop.hbase.ipc.HBaseRPCErrorHandler. | A client program may be interrupted by NoSuchMethodError exception. |
| 5 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.AdminProtos.AdminService.BlockingInterface. | A client program may be interrupted by NoSuchMethodError exception. |
| 6 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface. | A client program may be interrupted by NoSuchMethodError exception. |
| 7 | Field catalogTracker of type org.apache.hadoop.hbase.catalog.CatalogTracker has been removed from this class. | A client program may be interrupted by NoSuchFieldError exception. |
| 8 | Field hlog of type wal.HLog has been removed from this class. | A client program may be interrupted by NoSuchFieldError exception. |
| 9 | Field hlogForMeta of type wal.HLog has been removed from this class. | A client program may be interrupted by NoSuchFieldError exception. |
| 10 | Field isOnline of type boolean has been removed from this class. | A client program may be interrupted by NoSuchFieldError exception. |
| 11 | Field maxScannerResultSize of type long has been removed from this class. | A client program may be interrupted by NoSuchFieldError exception. |
[+] affected methods (1)
getRegionServer ( int )Return value of this method has type 'HRegionServer'.
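The access-level and removed-field changes above affect code that extends HRegionServer. A minimal, hypothetical sketch (constructor signature per 0.98.9):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.regionserver.HRegionServer;

// Hypothetical extension compiled against 0.98.9, where 'stopped' is a
// protected field. Against branch-1.0 the same field access fails with
// IllegalAccessError (the field became private); references to removed fields
// such as 'isOnline' or 'hlog' would fail with NoSuchFieldError instead.
public class RegionServerProbe extends HRegionServer {
    public RegionServerProbe(Configuration conf) throws Exception {
        super(conf); // 0.98.9 constructor signature
    }

    boolean stopRequested() {
        return this.stopped; // protected in 0.98.9, private in branch-1.0
    }
}
```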
Problems with Methods, High Severity (12)
hbase-client-0.98.9.jar,
Append
package org.apache.hadoop.hbase.client
[+] Append.setReturnResults ( boolean returnResults ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Append.setReturnResults:(Z)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to org.apache.hadoop.hbase.client.Append. | This method has been removed because the return type is part of the method signature. |
hbase-client-0.98.9.jar,
Attributes
package org.apache.hadoop.hbase.client
[+] Attributes.setAttribute ( String p1, byte[ ] p2 ) [abstract] : void (1)
[mangled: org/apache/hadoop/hbase/client/Attributes.setAttribute:(Ljava/lang/String;[B)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to org.apache.hadoop.hbase.client.Attributes. | This method has been removed because the return type is part of the method signature. |
hbase-client-0.98.9.jar,
Scan
package org.apache.hadoop.hbase.client
[+] Scan.setBatch ( int batch ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setBatch:(I)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setCacheBlocks ( boolean cacheBlocks ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setCacheBlocks:(Z)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setCaching ( int caching ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setCaching:(I)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setLoadColumnFamiliesOnDemand ( boolean value ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setLoadColumnFamiliesOnDemand:(Z)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setMaxResultSize ( long maxResultSize ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setMaxResultSize:(J)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setMaxResultsPerColumnFamily ( int limit ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setMaxResultsPerColumnFamily:(I)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setRaw ( boolean raw ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setRaw:(Z)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setRowOffsetPerColumnFamily ( int offset ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setRowOffsetPerColumnFamily:(I)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to Scan. | This method has been removed because the return type is part of the method signature. |
[+] Scan.setSmall ( boolean small ) : void (1)
[mangled: org/apache/hadoop/hbase/client/Scan.setSmall:(Z)V]
| Change | Effect |
|---|
| 1 | Return value type has been changed from void to org.apache.hadoop.hbase.client.Scan. | This method has been removed because the return type is part of the method signature. |
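The Append, Attributes, and Scan entries above all share one root cause: the JVM identifies a method by its name plus full descriptor, return type included, so making a void setter fluent removes the old method from a binary point of view. A sketch with an invented helper class:

```java
import org.apache.hadoop.hbase.client.Scan;

// Hypothetical client code compiled against 0.98.9 emits references such as
// Scan.setCaching:(I)V. Branch-1.0 only provides
// Scan.setCaching:(I)Lorg/apache/hadoop/hbase/client/Scan;, so the old
// references fail with NoSuchMethodError at run time; recompiling the
// unchanged source against 1.0 picks up the new descriptors.
public class ScanDefaults {
    public static Scan wideScan() {
        Scan scan = new Scan();
        scan.setCaching(500);       // (I)V against 0.98.9
        scan.setCacheBlocks(false); // (Z)V against 0.98.9
        scan.setBatch(100);         // (I)V against 0.98.9
        return scan;
    }
}
```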
hbase-common-0.98.9.jar,
AuthUtil
package org.apache.hadoop.hbase
[+] AuthUtil.AuthUtil ( ) (1)
[mangled: org/apache/hadoop/hbase/AuthUtil."<init>":()V]
| Change | Effect |
|---|
| 1 | Access level has been changed from public to private. | A client program may be interrupted by IllegalAccessError exception. |
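A one-line sketch of the constructor change; the holder class is hypothetical:

```java
import org.apache.hadoop.hbase.AuthUtil;

// Hypothetical client code compiled against 0.98.9, where AuthUtil still has a
// public no-argument constructor. Branch-1.0 makes the constructor private, so
// this instantiation links against the old jar but throws IllegalAccessError
// against the new one.
public class AuthUtilHolder {
    private final AuthUtil util = new AuthUtil(); // private constructor in branch-1.0
}
```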
Problems with Data Types, Medium Severity (5)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase.client
[+] HConnection (1)
| Change | Effect |
|---|
| 1 | Added super-interface Connection. | If abstract methods from an added super-interface must be implemented by client then it may be interrupted by AbstractMethodError exception. Abstract method getClient (org.apache.hadoop.hbase.ServerName) from the added super-interface is called by the method prepare ( boolean ) in 2nd library version and may not be implemented by old clients. |
[+] affected methods (59)
clearCaches ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
clearRegionCache ( )This abstract method is from 'HConnection' interface.
clearRegionCache ( byte[ ] )This abstract method is from 'HConnection' interface.
clearRegionCache ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
deleteCachedRegionLocation ( org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName, boolean )This abstract method is from 'HConnection' interface.
getClient ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getConfiguration ( )This abstract method is from 'HConnection' interface.
getCurrentNrHRS ( )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( byte[ ] )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getHTableDescriptors ( java.util.List<java.lang.String> )This abstract method is from 'HConnection' interface.
getHTableDescriptorsByTableName ( java.util.List<org.apache.hadoop.hbase.TableName> )This abstract method is from 'HConnection' interface.
getKeepAliveMasterService ( )This abstract method is from 'HConnection' interface.
getMaster ( )This abstract method is from 'HConnection' interface.
getNonceGenerator ( )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( byte[ ] )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getRegionLocation ( byte[ ], byte[ ], boolean )This abstract method is from 'HConnection' interface.
getRegionLocation ( org.apache.hadoop.hbase.TableName, byte[ ], boolean )This abstract method is from 'HConnection' interface.
getTable ( byte[ ] )This abstract method is from 'HConnection' interface.
getTable ( byte[ ], java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTableNames ( )This abstract method is from 'HConnection' interface.
isClosed ( )This abstract method is from 'HConnection' interface.
isDeadServer ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
isMasterRunning ( )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ], byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName, byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableEnabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
listTableNames ( )This abstract method is from 'HConnection' interface.
listTables ( )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ], boolean, boolean )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName, boolean, boolean )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
relocateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
relocateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( byte[ ], boolean )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This abstract method is from 'HConnection' interface.
updateCachedLocations ( byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getConnection ( )Return value of this method has type 'HConnection'.
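Because concrete HConnection implementations have dozens of methods, the sketch below is deliberately left abstract; it only marks where the AbstractMethodError risk sits for an old client implementation:

```java
import org.apache.hadoop.hbase.client.HConnection;

// Deliberately abstract skeleton; a real client implementation of HConnection
// compiled against 0.98.9 would be concrete and implement only the 0.98.9
// methods. Under branch-1.0, HConnection also inherits abstract methods from
// the new Connection super-interface, and when library code (such as the
// prepare(boolean) path cited above) invokes one of them on the old
// implementation, the call fails with AbstractMethodError.
public abstract class DelegatingConnection implements HConnection {
    // 0.98.9-era overrides would go here; the methods contributed by the new
    // Connection super-interface are exactly what an old implementation lacks.
}
```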
[+] HTableInterface (1)
| Change | Effect |
|---|
| 1 | Added super-interface Table. | If abstract methods from an added super-interface must be implemented by client then it may be interrupted by AbstractMethodError exception. Abstract method batchCoprocessorService (com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[], byte[], com.google.protobuf.Message, coprocessor.Batch.Callback) from the added super-interface is called by the method batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R, coprocessor.Batch.Callback<R> ) in 2nd library version and may not be implemented by old clients. |
[+] affected methods (10)
getTable ( byte[ ] )Return value of this abstract method has type 'HTableInterface'.
getTable ( byte[ ], java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'table' of this method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this abstract method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'p1' of this abstract method has type 'HTableInterface'.
[+] NoServerForRegionException (1)
| Change | Effect |
|---|
| 1 | Superclass has been changed from org.apache.hadoop.hbase.RegionException to DoNotRetryRegionException. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (2)
NoServerForRegionException ( )This constructor is from 'NoServerForRegionException' class.
NoServerForRegionException ( java.lang.String )This constructor is from 'NoServerForRegionException' class.
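The superclass change can also shift which catch clause handles the exception. A hedged sketch with an invented callback interface:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.RegionException;

// Hypothetical error handling written against 0.98.9, where
// NoServerForRegionException extends RegionException and lands in the first
// catch clause. After the branch-1.0 superclass change it may instead fall
// through to the generic IOException clause.
public class RegionErrorHandling {
    /** Invented callback representing some region lookup. */
    public interface LocateAction {
        void run() throws IOException;
    }

    public static void locate(LocateAction action) {
        try {
            action.run();
        } catch (RegionException e) {
            System.err.println("region problem: " + e.getMessage()); // 0.98.9 path
        } catch (IOException e) {
            System.err.println("I/O problem: " + e.getMessage());    // possible 1.0 path
        }
    }
}
```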
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.mapreduce
[+] ImportTsv (1)
| Change | Effect |
|---|
| 1 | Value of final field JOB_NAME_CONF_KEY (java.lang.String) has been changed from "mapred.job.name" to "mapreduce.job.name". | Old value of the field will be inlined to the client code at compile-time and will be used instead of a new one. |
[+] affected methods (4)
createSubmittableJob ( org.apache.hadoop.conf.Configuration, java.lang.String[ ] )This method is from 'ImportTsv' class.
ImportTsv ( )This constructor is from 'ImportTsv' class.
main ( java.lang.String[ ] )This method is from 'ImportTsv' class.
run ( java.lang.String[ ] )This method is from 'ImportTsv' class.
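Because JOB_NAME_CONF_KEY is a compile-time String constant, its old value is baked into client class files. A sketch (helper class invented, field accessibility assumed):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.mapreduce.ImportTsv;

// JOB_NAME_CONF_KEY is a compile-time String constant, so javac copies its
// value into the client class file. Hypothetical client code compiled against
// 0.98.9 therefore keeps setting "mapred.job.name" even when it runs with the
// branch-1.0 jars, whose field value is "mapreduce.job.name"; only a rebuild
// picks up the new value.
public class JobNaming {
    public static void nameJob(Configuration conf, String name) {
        conf.set(ImportTsv.JOB_NAME_CONF_KEY, name); // literal inlined at compile time
    }
}
```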
package org.apache.hadoop.hbase.master
[+] HMaster (1)
| Change | Effect |
|---|
| 1 | Superclass has been changed from org.apache.hadoop.hbase.util.HasThread to org.apache.hadoop.hbase.regionserver.HRegionServer. | 1) Access of a client program to the fields or methods of the old super-class may be interrupted by NoSuchFieldError or NoSuchMethodError exceptions. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
[+] affected methods (2)
getActiveMaster ( )Return value of this method has type 'HMaster'.
getMaster ( int )Return value of this method has type 'HMaster'.
Problems with Data Types, Low Severity (25)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase.client
[+] Append (2)
| Change | Effect |
|---|
| 1 | Method setTTL ( long ) has been overridden by setTTL ( long ) | Method setTTL ( long ) will be called instead of setTTL ( long ) in a client program. |
| 2 | Method setTTL ( long ) has been overridden by setTTL ( long ) | Method setTTL ( long ) will be called instead of setTTL ( long ) in a client program. |
[+] affected methods (1)
setTTL ( long )Method 'setTTL ( long )' will be called instead of this method in a client program.
[+] Put (2)
| Change | Effect |
|---|
| 1 | Method setTTL ( long ) has been overridden by setTTL ( long ) | Method setTTL ( long ) will be called instead of setTTL ( long ) in a client program. |
| 2 | Method setTTL ( long ) has been overridden by setTTL ( long ) | Method setTTL ( long ) will be called instead of setTTL ( long ) in a client program. |
[+] affected methods (1)
setTTL ( long )Method 'setTTL ( long )' will be called instead of this method in a client program.
package org.apache.hadoop.hbase.filter
[+] CompareFilter (1)
| Change | Effect |
|---|
| 1 | Method transformCell ( org.apache.hadoop.hbase.Cell ) has been overridden by transformCell ( org.apache.hadoop.hbase.Cell ) | Method transformCell ( org.apache.hadoop.hbase.Cell ) will be called instead of transformCell ( org.apache.hadoop.hbase.Cell ) in a client program. |
[+] affected methods (1)
transformCell ( org.apache.hadoop.hbase.Cell )Method 'transformCell ( org.apache.hadoop.hbase.Cell )' will be called instead of this method in a client program.
[+] FuzzyRowFilter (1)
| Change | Effect |
|---|
| 1 | Method transformCell ( org.apache.hadoop.hbase.Cell ) has been overridden by transformCell ( org.apache.hadoop.hbase.Cell ) | Method transformCell ( org.apache.hadoop.hbase.Cell ) will be called instead of transformCell ( org.apache.hadoop.hbase.Cell ) in a client program. |
[+] affected methods (1)
transformCell ( org.apache.hadoop.hbase.Cell )Method 'transformCell ( org.apache.hadoop.hbase.Cell )' will be called instead of this method in a client program.
[+] PrefixFilter (1)
| Change | Effect |
|---|
| 1 | Method transformCell ( org.apache.hadoop.hbase.Cell ) has been overridden by transformCell ( org.apache.hadoop.hbase.Cell ) | Method transformCell ( org.apache.hadoop.hbase.Cell ) will be called instead of transformCell ( org.apache.hadoop.hbase.Cell ) in a client program. |
[+] affected methods (1)
transformCell ( org.apache.hadoop.hbase.Cell )Method 'transformCell ( org.apache.hadoop.hbase.Cell )' will be called instead of this method in a client program.
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.mapreduce
[+] TableInputFormat (1)
| Change | Effect |
|---|
| 1 | Method getSplits ( org.apache.hadoop.mapreduce.JobContext ) has been overridden by getSplits ( org.apache.hadoop.mapreduce.JobContext ) | Method getSplits ( org.apache.hadoop.mapreduce.JobContext ) will be called instead of getSplits ( org.apache.hadoop.mapreduce.JobContext ) in a client program. |
[+] affected methods (1)
getSplits ( org.apache.hadoop.mapreduce.JobContext )Method 'getSplits ( org.apache.hadoop.mapreduce.JobContext )' will be called instead of this method in a client program.
package org.apache.hadoop.hbase.regionserver
[+] HRegionServer (3)
| Change | Effect |
|---|
| 1 | Added super-class org.apache.hadoop.hbase.util.HasThread. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
| 2 | Field REGIONSERVER_CONF (java.lang.String) with the compile-time constant value "regionserver_conf" has been removed from this class. | A client program may change behavior. |
| 3 | Field REGION_SERVER_RPC_SCHEDULER_FACTORY_CLASS (java.lang.String) with the compile-time constant value "hbase.region.server.rpc.scheduler.factory.class" has been removed from this class. | A client program may change behavior. |
[+] affected methods (1)
getRegionServer ( int )Return value of this method has type 'HRegionServer'.
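The two removed String constants were compile-time constants, so old clients keep the inlined literals. A sketch assuming the field was accessible to subclasses:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.regionserver.HRegionServer;

// REGION_SERVER_RPC_SCHEDULER_FACTORY_CLASS is a compile-time String constant
// in 0.98.9 (visibility to subclasses assumed here), so javac inlines the
// literal "hbase.region.server.rpc.scheduler.factory.class" into the client
// class. Running against branch-1.0 keeps using that stale literal even though
// the field was removed; the removal only surfaces as a compile error when the
// client is rebuilt.
public class SchedulerKeyProbe extends HRegionServer {
    public static final String RPC_SCHEDULER_KEY =
            REGION_SERVER_RPC_SCHEDULER_FACTORY_CLASS; // inlined by javac

    public SchedulerKeyProbe(Configuration conf) throws Exception {
        super(conf);
    }
}
```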
[+] MemStoreFlusher (2)
| Change | Effect |
|---|
| 1 | Field globalMemStoreLimit became non-final. | Old value of the field will be inlined to the client code at compile-time and will be used instead of a new one. |
| 2 | Field globalMemStoreLimitLowMark became non-final. | Old value of the field will be inlined to the client code at compile-time and will be used instead of a new one. |
[+] affected methods (1)
getRegionServer ( int )Field 'retval.cacheFlusher' in return value of this method has type 'MemStoreFlusher'.
package org.apache.hadoop.hbase.regionserver.wal
[+] HLogPrettyPrinter (12)
| Change | Effect |
|---|
| 1 | Added super-class org.apache.hadoop.hbase.wal.WALPrettyPrinter. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class and cause IncompatibleClassChangeError exception. |
| 2 | Method beginPersistentOutput ( ) has been moved up type hierarchy to beginPersistentOutput ( ) | Method beginPersistentOutput ( ) will be called instead of beginPersistentOutput ( ) in a client program. |
| 3 | Method disableJSON ( ) has been moved up type hierarchy to disableJSON ( ) | Method disableJSON ( ) will be called instead of disableJSON ( ) in a client program. |
| 4 | Method disableValues ( ) has been moved up type hierarchy to disableValues ( ) | Method disableValues ( ) will be called instead of disableValues ( ) in a client program. |
| 5 | Method enableJSON ( ) has been moved up type hierarchy to enableJSON ( ) | Method enableJSON ( ) will be called instead of enableJSON ( ) in a client program. |
| 6 | Method enableValues ( ) has been moved up type hierarchy to enableValues ( ) | Method enableValues ( ) will be called instead of enableValues ( ) in a client program. |
| 7 | Method endPersistentOutput ( ) has been moved up type hierarchy to endPersistentOutput ( ) | Method endPersistentOutput ( ) will be called instead of endPersistentOutput ( ) in a client program. |
| 8 | Method processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path ) has been moved up type hierarchy to processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path ) | Method processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path ) will be called instead of processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path ) in a client program. |
| 9 | Method run ( java.lang.String[ ] ) has been moved up type hierarchy to run ( java.lang.String[ ] ) | Method run ( java.lang.String[ ] ) will be called instead of run ( java.lang.String[ ] ) in a client program. |
| 10 | Method setRegionFilter ( java.lang.String ) has been moved up type hierarchy to setRegionFilter ( java.lang.String ) | Method setRegionFilter ( java.lang.String ) will be called instead of setRegionFilter ( java.lang.String ) in a client program. |
| 11 | Method setRowFilter ( java.lang.String ) has been moved up type hierarchy to setRowFilter ( java.lang.String ) | Method setRowFilter ( java.lang.String ) will be called instead of setRowFilter ( java.lang.String ) in a client program. |
| 12 | Method setSequenceFilter ( long ) has been moved up type hierarchy to setSequenceFilter ( long ) | Method setSequenceFilter ( long ) will be called instead of setSequenceFilter ( long ) in a client program. |
[+] affected methods (14)
beginPersistentOutput ( )Method 'beginPersistentOutput ( )' will be called instead of this method in a client program.
disableJSON ( )Method 'disableJSON ( )' will be called instead of this method in a client program.
disableValues ( )Method 'disableValues ( )' will be called instead of this method in a client program.
enableJSON ( )Method 'enableJSON ( )' will be called instead of this method in a client program.
enableValues ( )Method 'enableValues ( )' will be called instead of this method in a client program.
endPersistentOutput ( )Method 'endPersistentOutput ( )' will be called instead of this method in a client program.
HLogPrettyPrinter ( )This constructor is from 'HLogPrettyPrinter' class.
HLogPrettyPrinter ( boolean, boolean, long, java.lang.String, java.lang.String, boolean, java.io.PrintStream )This constructor is from 'HLogPrettyPrinter' class.
main ( java.lang.String[ ] )This method is from 'HLogPrettyPrinter' class.
processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path )Method 'processFile ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.Path )' will be called instead of this method in a client program.
run ( java.lang.String[ ] )Method 'run ( java.lang.String[ ] )' will be called instead of this method in a client program.
setRegionFilter ( java.lang.String )Method 'setRegionFilter ( java.lang.String )' will be called instead of this method in a client program.
setRowFilter ( java.lang.String )Method 'setRowFilter ( java.lang.String )' will be called instead of this method in a client program.
setSequenceFilter ( long )Method 'setSequenceFilter ( long )' will be called instead of this method in a client program.
Other Changes in Data Types (34)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase
[+] HRegionInfo (2)
| Change | Effect |
|---|
| 1 | Field REPLICA_ID_DELIMITER has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field REPLICA_ID_FORMAT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (75)
checkScanStopRow ( byte[ ] )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
close ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getScan ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTable ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTableName ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTimestamp ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
initializeScannerInConstruction ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
next ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
nextScanner ( int, boolean )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
writeScanMetrics ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
areAdjacent ( HRegionInfo, HRegionInfo )This method is from 'HRegionInfo' class.
compareTo ( java.lang.Object )This method is from 'HRegionInfo' class.
compareTo ( HRegionInfo )This method is from 'HRegionInfo' class.
containsRange ( byte[ ], byte[ ] )This method is from 'HRegionInfo' class.
containsRow ( byte[ ] )This method is from 'HRegionInfo' class.
convert ( HRegionInfo )This method is from 'HRegionInfo' class.
convert ( protobuf.generated.HBaseProtos.RegionInfo )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], byte[ ], boolean )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], java.lang.String, boolean )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], long, boolean )This method is from 'HRegionInfo' class.
encodeRegionName ( byte[ ] )This method is from 'HRegionInfo' class.
equals ( java.lang.Object )This method is from 'HRegionInfo' class.
getComparator ( )This method is from 'HRegionInfo' class.
getDaughterRegions ( client.Result )This method is from 'HRegionInfo' class.
getEncodedName ( )This method is from 'HRegionInfo' class.
getEncodedNameAsBytes ( )This method is from 'HRegionInfo' class.
getEndKey ( )This method is from 'HRegionInfo' class.
getHRegionInfo ( client.Result )This method is from 'HRegionInfo' class.
getHRegionInfo ( client.Result, byte[ ] )This method is from 'HRegionInfo' class.
getHRegionInfoAndServerName ( client.Result )This method is from 'HRegionInfo' class.
getMergeRegions ( client.Result )This method is from 'HRegionInfo' class.
getRegionId ( )This method is from 'HRegionInfo' class.
getRegionName ( )This method is from 'HRegionInfo' class.
getRegionNameAsString ( )This method is from 'HRegionInfo' class.
getSeqNumDuringOpen ( client.Result )This method is from 'HRegionInfo' class.
getServerName ( client.Result )This method is from 'HRegionInfo' class.
getShortNameToLog ( )This method is from 'HRegionInfo' class.
getStartKey ( )This method is from 'HRegionInfo' class.
getStartKey ( byte[ ] )This method is from 'HRegionInfo' class.
getTable ( )This method is from 'HRegionInfo' class.
getTable ( byte[ ] )This method is from 'HRegionInfo' class.
getTableName ( )This method is from 'HRegionInfo' class.
getTableName ( byte[ ] )This method is from 'HRegionInfo' class.
getVersion ( )This method is from 'HRegionInfo' class.
hashCode ( )This method is from 'HRegionInfo' class.
HRegionInfo ( )This constructor is from 'HRegionInfo' class.
HRegionInfo ( HRegionInfo )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ] )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ], boolean )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ], boolean, long )This constructor is from 'HRegionInfo' class.
isMetaRegion ( )This method is from 'HRegionInfo' class.
isMetaTable ( )This method is from 'HRegionInfo' class.
isOffline ( )This method is from 'HRegionInfo' class.
isSplit ( )This method is from 'HRegionInfo' class.
isSplitParent ( )This method is from 'HRegionInfo' class.
parseDelimitedFrom ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseFrom ( byte[ ] )This method is from 'HRegionInfo' class.
parseFrom ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseFrom ( java.io.DataInputStream )This method is from 'HRegionInfo' class.
parseFromOrNull ( byte[ ] )This method is from 'HRegionInfo' class.
parseFromOrNull ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseRegionName ( byte[ ] )This method is from 'HRegionInfo' class.
prettyPrint ( java.lang.String )This method is from 'HRegionInfo' class.
readFields ( java.io.DataInput )This method is from 'HRegionInfo' class.
setOffline ( boolean )This method is from 'HRegionInfo' class.
setSplit ( boolean )This method is from 'HRegionInfo' class.
toByteArray ( )This method is from 'HRegionInfo' class.
toDelimitedByteArray ( )This method is from 'HRegionInfo' class.
toDelimitedByteArray ( HRegionInfo... )This method is from 'HRegionInfo' class.
toString ( )This method is from 'HRegionInfo' class.
write ( java.io.DataOutput )This method is from 'HRegionInfo' class.
getRegionInfo ( )Return value of this method has type 'HRegionInfo'.
HRegionLocation ( HRegionInfo, ServerName )1st parameter 'regionInfo' of this method has type 'HRegionInfo'.
HRegionLocation ( HRegionInfo, ServerName, long )1st parameter 'regionInfo' of this method has type 'HRegionInfo'.
[+] HTableDescriptor (2)
| Change | Effect |
|---|
| 1 | Field DEFAULT_REGION_REPLICATION has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field REGION_REPLICATION has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (3)
getHTableDescriptor ( byte[ ] )Return value of this abstract method has type 'HTableDescriptor'.
getHTableDescriptor ( TableName )Return value of this abstract method has type 'HTableDescriptor'.
getTableDescriptor ( )Return value of this method has type 'HTableDescriptor'.
package org.apache.hadoop.hbase.client
[+] Attributes (1)
| Change | Effect |
|---|
| 1 | Abstract method setAttribute ( java.lang.String, byte[ ] ) has been added to this interface. | No effect. |
[+] affected methods (2)
getAttribute ( java.lang.String )This abstract method is from 'Attributes' interface.
getAttributesMap ( )This abstract method is from 'Attributes' interface.
[+] ClientScanner (3)
| Change | Effect |
|---|
| 1 | Field conf has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field pool has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 3 | Field primaryOperationTimeout has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (10)
checkScanStopRow ( byte[ ] )This method is from 'ClientScanner' class.
close ( )This method is from 'ClientScanner' class.
getScan ( )This method is from 'ClientScanner' class.
getTable ( )This method is from 'ClientScanner' class.
getTableName ( )This method is from 'ClientScanner' class.
getTimestamp ( )This method is from 'ClientScanner' class.
initializeScannerInConstruction ( )This method is from 'ClientScanner' class.
next ( )This method is from 'ClientScanner' class.
nextScanner ( int, boolean )This method is from 'ClientScanner' class.
writeScanMetrics ( )This method is from 'ClientScanner' class.
[+] HConnection (3)
| Change | Effect |
|---|
| 1 | Abstract method getAdmin ( ) has been added to this interface. | No effect. |
| 2 | Abstract method getRegionLocator ( org.apache.hadoop.hbase.TableName ) has been added to this interface. | No effect. |
| 3 | Abstract method updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.ServerName ) has been added to this interface. | No effect. |
[+] affected methods (59)
clearCaches ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
clearRegionCache ( )This abstract method is from 'HConnection' interface.
clearRegionCache ( byte[ ] )This abstract method is from 'HConnection' interface.
clearRegionCache ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
deleteCachedRegionLocation ( org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName, boolean )This abstract method is from 'HConnection' interface.
getClient ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getConfiguration ( )This abstract method is from 'HConnection' interface.
getCurrentNrHRS ( )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( byte[ ] )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getHTableDescriptors ( java.util.List<java.lang.String> )This abstract method is from 'HConnection' interface.
getHTableDescriptorsByTableName ( java.util.List<org.apache.hadoop.hbase.TableName> )This abstract method is from 'HConnection' interface.
getKeepAliveMasterService ( )This abstract method is from 'HConnection' interface.
getMaster ( )This abstract method is from 'HConnection' interface.
getNonceGenerator ( )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( byte[ ] )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getRegionLocation ( byte[ ], byte[ ], boolean )This abstract method is from 'HConnection' interface.
getRegionLocation ( org.apache.hadoop.hbase.TableName, byte[ ], boolean )This abstract method is from 'HConnection' interface.
getTable ( byte[ ] )This abstract method is from 'HConnection' interface.
getTable ( byte[ ], java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTableNames ( )This abstract method is from 'HConnection' interface.
isClosed ( )This abstract method is from 'HConnection' interface.
isDeadServer ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
isMasterRunning ( )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ], byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName, byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableEnabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
listTableNames ( )This abstract method is from 'HConnection' interface.
listTables ( )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ], boolean, boolean )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName, boolean, boolean )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
relocateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
relocateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( byte[ ], boolean )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This abstract method is from 'HConnection' interface.
updateCachedLocations ( byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getConnection ( )Return value of this method has type 'HConnection'.
[+] HTable (1)
| | Change | Effect |
|---|---|---|
| 1 | Field multiAp has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
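For illustration only: the NOTE above refers to the classic constant-interface pattern on the client side. The sketch below uses invented class names (no HBase types) to show the shape of code where a static interface field and a newly added superclass field share a name; whether an IncompatibleClassChangeError actually surfaces depends on which classes are recompiled against which library version.

```java
// Hypothetical client classes illustrating the field-hiding pattern the NOTE warns about.
class LibraryBase {
    // In the newer library version this class gains a field with the same
    // name as the interface constant below, e.g.:
    // Object multiAp;
}

interface ClientConstants {
    int multiAp = 1;   // interface fields are implicitly public static final
}

class ClientTable extends LibraryBase implements ClientConstants {
    int readConstant() {
        // Resolves against the static interface constant at compile time. If the
        // superclass later adds a same-named instance field and only some of the
        // classes are recompiled, the static/instance mismatch seen during field
        // resolution is what can surface as java.lang.IncompatibleClassChangeError.
        return multiAp;
    }
}
```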
[+] affected methods (86)
append ( Append )This method is from 'HTable' class.
batch ( java.util.List<? extends Row> )This method is from 'HTable' class.
batch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
checkAndDelete ( byte[ ], byte[ ], byte[ ], byte[ ], Delete )This method is from 'HTable' class.
checkAndMutate ( byte[ ], byte[ ], byte[ ], org.apache.hadoop.hbase.filter.CompareFilter.CompareOp, byte[ ], RowMutations )This method is from 'HTable' class.
checkAndPut ( byte[ ], byte[ ], byte[ ], byte[ ], Put )This method is from 'HTable' class.
clearRegionCache ( )This method is from 'HTable' class.
close ( )This method is from 'HTable' class.
coprocessorService ( byte[ ] )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R> )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
delete ( java.util.List<Delete> )This method is from 'HTable' class.
delete ( Delete )This method is from 'HTable' class.
exists ( java.util.List<Get> )This method is from 'HTable' class.
exists ( Get )This method is from 'HTable' class.
flushCommits ( )This method is from 'HTable' class.
get ( java.util.List<Get> )This method is from 'HTable' class.
get ( Get )This method is from 'HTable' class.
getConfiguration ( )This method is from 'HTable' class.
getConnection ( )This method is from 'HTable' class.
getDefaultExecutor ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getEndKeys ( )This method is from 'HTable' class.
getMaxKeyValueSize ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getName ( )This method is from 'HTable' class.
getOperationTimeout ( )This method is from 'HTable' class.
getRegionCachePrefetch ( byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionLocation ( byte[ ] )This method is from 'HTable' class.
getRegionLocation ( byte[ ], boolean )This method is from 'HTable' class.
getRegionLocation ( java.lang.String )This method is from 'HTable' class.
getRegionLocations ( )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ] )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ], boolean )This method is from 'HTable' class.
getRowOrBefore ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( Scan )This method is from 'HTable' class.
getScannerCaching ( )This method is from 'HTable' class.
getStartEndKeys ( )This method is from 'HTable' class.
getStartKeys ( )This method is from 'HTable' class.
getTableDescriptor ( )This method is from 'HTable' class.
getTableName ( )This method is from 'HTable' class.
getWriteBuffer ( )This method is from 'HTable' class.
getWriteBufferSize ( )This method is from 'HTable' class.
HTable ( )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ] )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ], java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, java.lang.String )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
increment ( Increment )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, boolean )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, Durability )This method is from 'HTable' class.
isAutoFlush ( )This method is from 'HTable' class.
isTableEnabled ( byte[ ] )This method is from 'HTable' class.
isTableEnabled ( java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
mutateRow ( RowMutations )This method is from 'HTable' class.
processBatch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
processBatchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
put ( java.util.List<Put> )This method is from 'HTable' class.
put ( Put )This method is from 'HTable' class.
setAutoFlush ( boolean )This method is from 'HTable' class.
setAutoFlush ( boolean, boolean )This method is from 'HTable' class.
setAutoFlushTo ( boolean )This method is from 'HTable' class.
setOperationTimeout ( int )This method is from 'HTable' class.
setRegionCachePrefetch ( byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setScannerCaching ( int )This method is from 'HTable' class.
setWriteBufferSize ( long )This method is from 'HTable' class.
toString ( )This method is from 'HTable' class.
validatePut ( Put )This method is from 'HTable' class.
validatePut ( Put, int )This method is from 'HTable' class.
configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job, HTable )2nd parameter 'table' of this method has type 'HTable'.
[+] Query (2)
| | Change | Effect |
|---|---|---|
| 1 | Field consistency has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field targetReplicaId has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
setFilter ( org.apache.hadoop.hbase.filter.Filter )Return value of this method has type 'Query'.
hbase-common-0.98.9.jar
package org.apache.hadoop.hbase
[+] HBaseInterfaceAudience (1)
| | Change | Effect |
|---|---|---|
| 1 | Field TOOLS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
HBaseInterfaceAudience ( )This constructor is from 'HBaseInterfaceAudience' class.
hbase-protocol-0.98.9.jar
package org.apache.hadoop.hbase.protobuf.generated
[+] HBaseProtos.RegionInfo (1)
| | Change | Effect |
|---|---|---|
| 1 | Field REPLICA_ID_FIELD_NUMBER has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (2)
convert ( org.apache.hadoop.hbase.HRegionInfo )Return value of this method has type 'HBaseProtos.RegionInfo'.
convert ( HBaseProtos.RegionInfo )1st parameter 'proto' of this method has type 'HBaseProtos.RegionInfo'.
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.mapreduce
[+] ImportTsv (1)
| | Change | Effect |
|---|---|---|
| 1 | Field NO_STRICT_COL_FAMILY has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (4)
createSubmittableJob ( org.apache.hadoop.conf.Configuration, java.lang.String[ ] )This method is from 'ImportTsv' class.
ImportTsv ( )This constructor is from 'ImportTsv' class.
main ( java.lang.String[ ] )This method is from 'ImportTsv' class.
run ( java.lang.String[ ] )This method is from 'ImportTsv' class.
[+] TableInputFormat (1)
| | Change | Effect |
|---|---|---|
| 1 | Field SHUFFLE_MAPS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (6)
addColumns ( org.apache.hadoop.hbase.client.Scan, byte[ ][ ] )This method is from 'TableInputFormat' class.
configureSplitTable ( org.apache.hadoop.mapreduce.Job, org.apache.hadoop.hbase.TableName )This method is from 'TableInputFormat' class.
getConf ( )This method is from 'TableInputFormat' class.
getStartEndKeys ( )This method is from 'TableInputFormat' class.
setConf ( org.apache.hadoop.conf.Configuration )This method is from 'TableInputFormat' class.
TableInputFormat ( )This constructor is from 'TableInputFormat' class.
package org.apache.hadoop.hbase.regionserver
[+] CompactSplitThread (8)
| | Change | Effect |
|---|---|---|
| 1 | Field LARGE_COMPACTION_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field LARGE_COMPACTION_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 3 | Field MERGE_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 4 | Field MERGE_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 5 | Field SMALL_COMPACTION_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 6 | Field SMALL_COMPACTION_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 7 | Field SPLIT_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 8 | Field SPLIT_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
getRegionServer ( int )Field 'retval.compactSplitThread' in return value of this method has type 'CompactSplitThread'.
[+] HRegionServer (7)
| | Change | Effect |
|---|---|---|
| 1 | Field clusterConnection has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 2 | Field configurationManager has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 3 | Field csm has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 4 | Field hMemManager has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 5 | Field metaTableLocator has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 6 | Field rpcServices has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
| 7 | Field walFactory has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
getRegionServer ( int )Return value of this method has type 'HRegionServer'.
[+] MemStoreFlusher (1)
| | Change | Effect |
|---|---|---|
| 1 | Field globalMemStoreLimitLowMarkPercent has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class and cause IncompatibleClassChangeError exception. |
[+] affected methods (1)
getRegionServer ( int )Field 'retval.cacheFlusher' in return value of this method has type 'MemStoreFlusher'.
Java ARchives (15)
hbase-annotations-0.98.9.jar
hbase-checkstyle-0.98.9.jar
hbase-client-0.98.9.jar
hbase-common-0.98.9.jar
hbase-examples-0.98.9.jar
hbase-hadoop-compat-0.98.9.jar
hbase-hadoop2-compat-0.98.9.jar
hbase-it-0.98.9.jar
hbase-prefix-tree-0.98.9.jar
hbase-protocol-0.98.9.jar
hbase-rest-0.98.9.jar
hbase-server-0.98.9.jar
hbase-shell-0.98.9.jar
hbase-testing-util-0.98.9.jar
hbase-thrift-0.98.9.jar
Test Info
| Library Name | HBase |
| Version #1 | 0.98.9 |
| Version #2 | branch-1.0 |
| Java Version | 1.7.0_60 |
| Subject | Source Compatibility |
Test Results
| Total Java ARchives | 15 |
| Total Methods / Classes | 1572 / 3953 |
| Verdict | Incompatible (14.6%) |
Problem Summary
| | Severity | Count |
|---|---|---|
| Added Methods | - | 186 |
| Removed Methods | High | 34 |
| Problems with Data Types | High | 29 |
| | Medium | 2 |
| | Low | 2 |
| Problems with Methods | High | 1 |
| | Medium | 0 |
| | Low | 0 |
| Other Changes in Data Types | - | 30 |
Added Methods (186)
hbase-client-1.0.0.jar,
AccessControlClient.class
package org.apache.hadoop.hbase.security.access
AccessControlClient.getUserPermissions ( org.apache.hadoop.hbase.client.Connection connection, String tableRegex ) [static] : java.util.List<UserPermission>
[mangled: org/apache/hadoop/hbase/security/access/AccessControlClient.getUserPermissions:(Lorg/apache/hadoop/hbase/client/Connection;Ljava/lang/String;)Ljava/util/List;]
hbase-client-1.0.0.jar,
Append.class
package org.apache.hadoop.hbase.client
Append.setACL ( java.util.Map x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setAttribute ( String name, byte[ ] value ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Append;]
Append.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Append.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Append.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Append.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility expression ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setClusterIds ( java.util.List x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setClusterIds ( java.util.List<java.util.UUID> clusterIds ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setDurability ( Durability d ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setDurability ( Durability x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyCellMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyCellMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.Cell>> map ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setFamilyMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setFamilyMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.KeyValue>> map ) : Append *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Append.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setId ( String id ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Append;]
Append.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Append.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Append.setTTL ( long ttl ) : Append
[mangled: org/apache/hadoop/hbase/client/Append.setTTL:(J)Lorg/apache/hadoop/hbase/client/Append;]
Append.setTTL ( long x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setTTL:(J)Lorg/apache/hadoop/hbase/client/Mutation;]
Append.setWriteToWAL ( boolean write ) : Append *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Append.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Append;]
Append.setWriteToWAL ( boolean x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Append.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Mutation;]
hbase-client-1.0.0.jar,
CompareFilter.class
package org.apache.hadoop.hbase.filter
CompareFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/CompareFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
Connection.class
package org.apache.hadoop.hbase.client
Connection.close ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Connection.close:()V]
Connection.getAdmin ( ) [abstract] : Admin
[mangled: org/apache/hadoop/hbase/client/Connection.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
Connection.getConfiguration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/hadoop/hbase/client/Connection.getConfiguration:()Lorg/apache/hadoop/conf/Configuration;]
Connection.getRegionLocator ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : RegionLocator
[mangled: org/apache/hadoop/hbase/client/Connection.getRegionLocator:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/RegionLocator;]
Connection.getTable ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : Table
[mangled: org/apache/hadoop/hbase/client/Connection.getTable:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/Table;]
Connection.getTable ( org.apache.hadoop.hbase.TableName p1, java.util.concurrent.ExecutorService p2 ) [abstract] : Table
[mangled: org/apache/hadoop/hbase/client/Connection.getTable:(Lorg/apache/hadoop/hbase/TableName;Ljava/util/concurrent/ExecutorService;)Lorg/apache/hadoop/hbase/client/Table;]
Connection.isClosed ( ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Connection.isClosed:()Z]
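The Connection interface is the 1.0 replacement for most direct HConnection/HTable usage. A minimal usage sketch, assuming a running cluster and an existing table named 't1' (both placeholders):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class ConnectionExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Connection is heavyweight and thread-safe; create it once and share it.
        try (Connection connection = ConnectionFactory.createConnection(conf)) {
            TableName name = TableName.valueOf("t1");   // placeholder table name
            // Table replaces HTable for per-request work.
            try (Table table = connection.getTable(name)) {
                Result r = table.get(new Get(Bytes.toBytes("row1")));
                System.out.println("row exists: " + !r.isEmpty());
            }
            // RegionLocator takes over the region-lookup duties formerly on HTable.
            try (RegionLocator locator = connection.getRegionLocator(name)) {
                System.out.println("regions: " + locator.getAllRegionLocations().size());
            }
        }
    }
}
```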
hbase-client-1.0.0.jar,
Consistency.class
package org.apache.hadoop.hbase.client
Consistency.valueOf ( String name ) [static] : Consistency
[mangled: org/apache/hadoop/hbase/client/Consistency.valueOf:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Consistency;]
Consistency.values ( ) [static] : Consistency[ ]
[mangled: org/apache/hadoop/hbase/client/Consistency.values:()[Lorg/apache/hadoop/hbase/client/Consistency;]
hbase-client-1.0.0.jar,
FuzzyRowFilter.class
package org.apache.hadoop.hbase.filter
FuzzyRowFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/FuzzyRowFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
HConnection.class
package org.apache.hadoop.hbase.client
HConnection.getAdmin ( ) [abstract] : Admin
[mangled: org/apache/hadoop/hbase/client/HConnection.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
HConnection.getRegionLocator ( org.apache.hadoop.hbase.TableName p1 ) [abstract] : RegionLocator
[mangled: org/apache/hadoop/hbase/client/HConnection.getRegionLocator:(Lorg/apache/hadoop/hbase/TableName;)Lorg/apache/hadoop/hbase/client/RegionLocator;]
HConnection.updateCachedLocations ( org.apache.hadoop.hbase.TableName p1, byte[ ] p2, byte[ ] p3, Object p4, org.apache.hadoop.hbase.ServerName p5 ) [abstract] : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/HConnection.updateCachedLocations:(Lorg/apache/hadoop/hbase/TableName;[B[BLjava/lang/Object;Lorg/apache/hadoop/hbase/ServerName;)V]
hbase-client-1.0.0.jar,
HRegionInfo.class
package org.apache.hadoop.hbase
HRegionInfo.createRegionName ( TableName tableName, byte[ ] startKey, byte[ ] id, int replicaId, boolean newFormat ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/HRegionInfo.createRegionName:(Lorg/apache/hadoop/hbase/TableName;[B[BIZ)[B]
HRegionInfo.createRegionName ( TableName tableName, byte[ ] startKey, long regionid, int replicaId, boolean newFormat ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/HRegionInfo.createRegionName:(Lorg/apache/hadoop/hbase/TableName;[BJIZ)[B]
HRegionInfo.getReplicaId ( ) : int
[mangled: org/apache/hadoop/hbase/HRegionInfo.getReplicaId:()I]
HRegionInfo.HRegionInfo ( HRegionInfo other, int replicaId )
[mangled: org/apache/hadoop/hbase/HRegionInfo."<init>":(Lorg/apache/hadoop/hbase/HRegionInfo;I)V]
HRegionInfo.HRegionInfo ( TableName tableName, byte[ ] startKey, byte[ ] endKey, boolean split, long regionid, int replicaId )
[mangled: org/apache/hadoop/hbase/HRegionInfo."<init>":(Lorg/apache/hadoop/hbase/TableName;[B[BZJI)V]
HRegionInfo.isSystemTable ( ) : boolean
[mangled: org/apache/hadoop/hbase/HRegionInfo.isSystemTable:()Z]
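A small sketch of the new replica-aware HRegionInfo constructors and getReplicaId(); the table name and keys are placeholders:

```java
import org.apache.hadoop.hbase.HRegionInfo;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.util.Bytes;

public class RegionReplicaInfoExample {
    public static void main(String[] args) {
        HRegionInfo primary = new HRegionInfo(
                TableName.valueOf("t1"),          // placeholder table name
                Bytes.toBytes("a"),               // start key
                Bytes.toBytes("z"),               // end key
                false,                            // not split
                System.currentTimeMillis(),       // region id
                0);                               // replica id 0 = primary
        // Derive a secondary replica's metadata from the primary's.
        HRegionInfo secondary = new HRegionInfo(primary, 1);
        System.out.println("primary replicaId   = " + primary.getReplicaId());
        System.out.println("secondary replicaId = " + secondary.getReplicaId());
        System.out.println("secondary name      = " + secondary.getRegionNameAsString());
    }
}
```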
hbase-client-1.0.0.jar,
PrefixFilter.class
package org.apache.hadoop.hbase.filter
PrefixFilter.transformCell ( org.apache.hadoop.hbase.Cell v ) : org.apache.hadoop.hbase.Cell
[mangled: org/apache/hadoop/hbase/filter/PrefixFilter.transformCell:(Lorg/apache/hadoop/hbase/Cell;)Lorg/apache/hadoop/hbase/Cell;]
hbase-client-1.0.0.jar,
Put.class
package org.apache.hadoop.hbase.client
Put.setACL ( java.util.Map x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setAttribute ( String name, byte[ ] value ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Put;]
Put.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Put.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Put.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Put.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility expression ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setCellVisibility ( org.apache.hadoop.hbase.security.visibility.CellVisibility x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setCellVisibility:(Lorg/apache/hadoop/hbase/security/visibility/CellVisibility;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setClusterIds ( java.util.List x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setClusterIds ( java.util.List<java.util.UUID> clusterIds ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setClusterIds:(Ljava/util/List;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setDurability ( Durability d ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setDurability ( Durability x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setDurability:(Lorg/apache/hadoop/hbase/client/Durability;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyCellMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyCellMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.Cell>> map ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyCellMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setFamilyMap ( java.util.NavigableMap x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setFamilyMap ( java.util.NavigableMap<byte[ ],java.util.List<org.apache.hadoop.hbase.KeyValue>> map ) : Put *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Put.setFamilyMap:(Ljava/util/NavigableMap;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setId ( String id ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Put;]
Put.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Put.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Put.setTTL ( long ttl ) : Put
[mangled: org/apache/hadoop/hbase/client/Put.setTTL:(J)Lorg/apache/hadoop/hbase/client/Put;]
Put.setTTL ( long x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setTTL:(J)Lorg/apache/hadoop/hbase/client/Mutation;]
Put.setWriteToWAL ( boolean write ) : Put *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Put.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Put;]
Put.setWriteToWAL ( boolean x0 ) : Mutation
[mangled: org/apache/hadoop/hbase/client/Put.setWriteToWAL:(Z)Lorg/apache/hadoop/hbase/client/Mutation;]
hbase-client-1.0.0.jar,
Result.class
package org.apache.hadoop.hbase.client
Result.create ( java.util.List<org.apache.hadoop.hbase.Cell> cells, Boolean exists, boolean stale ) [static] : Result
[mangled: org/apache/hadoop/hbase/client/Result.create:(Ljava/util/List;Ljava/lang/Boolean;Z)Lorg/apache/hadoop/hbase/client/Result;]
Result.create ( org.apache.hadoop.hbase.Cell[ ] cells, Boolean exists, boolean stale ) [static] : Result
[mangled: org/apache/hadoop/hbase/client/Result.create:([Lorg/apache/hadoop/hbase/Cell;Ljava/lang/Boolean;Z)Lorg/apache/hadoop/hbase/client/Result;]
Result.getTotalSizeOfCells ( Result result ) [static] : long
[mangled: org/apache/hadoop/hbase/client/Result.getTotalSizeOfCells:(Lorg/apache/hadoop/hbase/client/Result;)J]
Result.isStale ( ) : boolean
[mangled: org/apache/hadoop/hbase/client/Result.isStale:()Z]
hbase-client-1.0.0.jar,
Scan.class
package org.apache.hadoop.hbase.client
Scan.setACL ( java.util.Map x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setACL ( java.util.Map<String,org.apache.hadoop.hbase.security.access.Permission> perms ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/util/Map;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setACL ( String user, org.apache.hadoop.hbase.security.access.Permission perms ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setACL ( String x0, org.apache.hadoop.hbase.security.access.Permission x1 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setACL:(Ljava/lang/String;Lorg/apache/hadoop/hbase/security/access/Permission;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setAttribute ( String name, byte[ ] value ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setAttribute ( String x0, byte[ ] x1 ) : Attributes
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/Attributes;]
Scan.setAttribute ( String x0, byte[ ] x1 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Scan.setAttribute:(Ljava/lang/String;[B)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Scan.setAuthorizations ( org.apache.hadoop.hbase.security.visibility.Authorizations authorizations ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setAuthorizations:(Lorg/apache/hadoop/hbase/security/visibility/Authorizations;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setAuthorizations ( org.apache.hadoop.hbase.security.visibility.Authorizations x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setAuthorizations:(Lorg/apache/hadoop/hbase/security/visibility/Authorizations;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setConsistency ( Consistency consistency ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setConsistency:(Lorg/apache/hadoop/hbase/client/Consistency;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setConsistency ( Consistency x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setConsistency:(Lorg/apache/hadoop/hbase/client/Consistency;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setId ( String id ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setId ( String x0 ) : OperationWithAttributes
[mangled: org/apache/hadoop/hbase/client/Scan.setId:(Ljava/lang/String;)Lorg/apache/hadoop/hbase/client/OperationWithAttributes;]
Scan.setIsolationLevel ( IsolationLevel level ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setIsolationLevel:(Lorg/apache/hadoop/hbase/client/IsolationLevel;)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setIsolationLevel ( IsolationLevel x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setIsolationLevel:(Lorg/apache/hadoop/hbase/client/IsolationLevel;)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setReplicaId ( int Id ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setReplicaId:(I)Lorg/apache/hadoop/hbase/client/Scan;]
Scan.setReplicaId ( int x0 ) : Query
[mangled: org/apache/hadoop/hbase/client/Scan.setReplicaId:(I)Lorg/apache/hadoop/hbase/client/Query;]
Scan.setRowPrefixFilter ( byte[ ] rowPrefix ) : Scan
[mangled: org/apache/hadoop/hbase/client/Scan.setRowPrefixFilter:([B)Lorg/apache/hadoop/hbase/client/Scan;]
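The setConsistency/Consistency additions enable timeline-consistent reads from region replicas, and Result.isStale() (listed above under Result.class) reports whether a row was served by a secondary. A minimal sketch, assuming an already opened Table and region replication configured on the cluster:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Consistency;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class TimelineScanExample {
    // Scans with TIMELINE consistency; rows served from a secondary replica
    // are flagged as stale.
    static void scanTimeline(Table table) throws IOException {
        Scan scan = new Scan();
        scan.setConsistency(Consistency.TIMELINE);        // default is STRONG
        scan.setRowPrefixFilter(Bytes.toBytes("user-"));  // also new in 1.0
        try (ResultScanner scanner = table.getScanner(scan)) {
            for (Result result : scanner) {
                System.out.println(Bytes.toString(result.getRow())
                        + (result.isStale() ? " (stale)" : ""));
            }
        }
    }
}
```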
hbase-client-1.0.0.jar,
Table.class
package org.apache.hadoop.hbase.client
Table.append ( Append p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.append:(Lorg/apache/hadoop/hbase/client/Append;)Lorg/apache/hadoop/hbase/client/Result;]
Table.batch ( java.util.List<? extends Row> p1 ) [abstract] : Object[ ] *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Table.batch:(Ljava/util/List;)[Ljava/lang/Object;]
Table.batch ( java.util.List<? extends Row> p1, Object[ ] p2 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batch:(Ljava/util/List;[Ljava/lang/Object;)V]
Table.batchCallback ( java.util.List<? extends Row> p1, Object[ ] p2, coprocessor.Batch.Callback<R> p3 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batchCallback:(Ljava/util/List;[Ljava/lang/Object;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.batchCallback ( java.util.List<? extends Row> p1, coprocessor.Batch.Callback<R> p2 ) [abstract] : Object[ ] *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/Table.batchCallback:(Ljava/util/List;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)[Ljava/lang/Object;]
Table.batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor p1, com.google.protobuf.Message p2, byte[ ] p3, byte[ ] p4, R p5 ) [abstract] : java.util.Map<byte[ ],R>
[mangled: org/apache/hadoop/hbase/client/Table.batchCoprocessorService:(Lcom/google/protobuf/Descriptors$MethodDescriptor;Lcom/google/protobuf/Message;[B[BLcom/google/protobuf/Message;)Ljava/util/Map;]
Table.batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor p1, com.google.protobuf.Message p2, byte[ ] p3, byte[ ] p4, R p5, coprocessor.Batch.Callback<R> p6 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.batchCoprocessorService:(Lcom/google/protobuf/Descriptors$MethodDescriptor;Lcom/google/protobuf/Message;[B[BLcom/google/protobuf/Message;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.checkAndDelete ( byte[ ] p1, byte[ ] p2, byte[ ] p3, byte[ ] p4, Delete p5 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndDelete:([B[B[B[BLorg/apache/hadoop/hbase/client/Delete;)Z]
Table.checkAndDelete ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, Delete p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndDelete:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/Delete;)Z]
Table.checkAndMutate ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, RowMutations p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndMutate:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/RowMutations;)Z]
Table.checkAndPut ( byte[ ] p1, byte[ ] p2, byte[ ] p3, byte[ ] p4, Put p5 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndPut:([B[B[B[BLorg/apache/hadoop/hbase/client/Put;)Z]
Table.checkAndPut ( byte[ ] p1, byte[ ] p2, byte[ ] p3, org.apache.hadoop.hbase.filter.CompareFilter.CompareOp p4, byte[ ] p5, Put p6 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.checkAndPut:([B[B[BLorg/apache/hadoop/hbase/filter/CompareFilter$CompareOp;[BLorg/apache/hadoop/hbase/client/Put;)Z]
Table.close ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.close:()V]
Table.coprocessorService ( byte[ ] p1 ) [abstract] : org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:([B)Lorg/apache/hadoop/hbase/ipc/CoprocessorRpcChannel;]
Table.coprocessorService ( Class<T> p1, byte[ ] p2, byte[ ] p3, coprocessor.Batch.Call<T,R> p4 ) [abstract] : java.util.Map<byte[ ],R>
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:(Ljava/lang/Class;[B[BLorg/apache/hadoop/hbase/client/coprocessor/Batch$Call;)Ljava/util/Map;]
Table.coprocessorService ( Class<T> p1, byte[ ] p2, byte[ ] p3, coprocessor.Batch.Call<T,R> p4, coprocessor.Batch.Callback<R> p5 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.coprocessorService:(Ljava/lang/Class;[B[BLorg/apache/hadoop/hbase/client/coprocessor/Batch$Call;Lorg/apache/hadoop/hbase/client/coprocessor/Batch$Callback;)V]
Table.delete ( java.util.List<Delete> p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.delete:(Ljava/util/List;)V]
Table.delete ( Delete p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.delete:(Lorg/apache/hadoop/hbase/client/Delete;)V]
Table.exists ( Get p1 ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.exists:(Lorg/apache/hadoop/hbase/client/Get;)Z]
Table.existsAll ( java.util.List<Get> p1 ) [abstract] : boolean[ ]
[mangled: org/apache/hadoop/hbase/client/Table.existsAll:(Ljava/util/List;)[Z]
Table.flushCommits ( ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.flushCommits:()V]
Table.get ( java.util.List<Get> p1 ) [abstract] : Result[ ]
[mangled: org/apache/hadoop/hbase/client/Table.get:(Ljava/util/List;)[Lorg/apache/hadoop/hbase/client/Result;]
Table.get ( Get p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.get:(Lorg/apache/hadoop/hbase/client/Get;)Lorg/apache/hadoop/hbase/client/Result;]
Table.getConfiguration ( ) [abstract] : org.apache.hadoop.conf.Configuration
[mangled: org/apache/hadoop/hbase/client/Table.getConfiguration:()Lorg/apache/hadoop/conf/Configuration;]
Table.getName ( ) [abstract] : org.apache.hadoop.hbase.TableName
[mangled: org/apache/hadoop/hbase/client/Table.getName:()Lorg/apache/hadoop/hbase/TableName;]
Table.getScanner ( byte[ ] p1 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:([B)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getScanner ( byte[ ] p1, byte[ ] p2 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:([B[B)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getScanner ( Scan p1 ) [abstract] : ResultScanner
[mangled: org/apache/hadoop/hbase/client/Table.getScanner:(Lorg/apache/hadoop/hbase/client/Scan;)Lorg/apache/hadoop/hbase/client/ResultScanner;]
Table.getTableDescriptor ( ) [abstract] : org.apache.hadoop.hbase.HTableDescriptor
[mangled: org/apache/hadoop/hbase/client/Table.getTableDescriptor:()Lorg/apache/hadoop/hbase/HTableDescriptor;]
Table.getWriteBufferSize ( ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.getWriteBufferSize:()J]
Table.increment ( Increment p1 ) [abstract] : Result
[mangled: org/apache/hadoop/hbase/client/Table.increment:(Lorg/apache/hadoop/hbase/client/Increment;)Lorg/apache/hadoop/hbase/client/Result;]
Table.incrementColumnValue ( byte[ ] p1, byte[ ] p2, byte[ ] p3, long p4 ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.incrementColumnValue:([B[B[BJ)J]
Table.incrementColumnValue ( byte[ ] p1, byte[ ] p2, byte[ ] p3, long p4, Durability p5 ) [abstract] : long
[mangled: org/apache/hadoop/hbase/client/Table.incrementColumnValue:([B[B[BJLorg/apache/hadoop/hbase/client/Durability;)J]
Table.isAutoFlush ( ) [abstract] : boolean
[mangled: org/apache/hadoop/hbase/client/Table.isAutoFlush:()Z]
Table.mutateRow ( RowMutations p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.mutateRow:(Lorg/apache/hadoop/hbase/client/RowMutations;)V]
Table.put ( java.util.List<Put> p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.put:(Ljava/util/List;)V]
Table.put ( Put p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.put:(Lorg/apache/hadoop/hbase/client/Put;)V]
Table.setAutoFlushTo ( boolean p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.setAutoFlushTo:(Z)V]
Table.setWriteBufferSize ( long p1 ) [abstract] : void
[mangled: org/apache/hadoop/hbase/client/Table.setWriteBufferSize:(J)V]
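Among the Table additions are checkAndPut/checkAndDelete/checkAndMutate overloads that take a CompareFilter.CompareOp instead of being limited to an equality check. A minimal sketch; the row and column names are invented:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.filter.CompareFilter.CompareOp;
import org.apache.hadoop.hbase.util.Bytes;

public class CheckAndPutExample {
    // The Put is applied atomically, and only when the check on the stored
    // value succeeds for the given CompareOp.
    static boolean conditionalUpdate(Table table, long expected, long newBalance)
            throws IOException {
        byte[] row = Bytes.toBytes("account-42");   // invented row key
        byte[] cf = Bytes.toBytes("d");
        byte[] q = Bytes.toBytes("balance");

        Put put = new Put(row);
        put.add(cf, q, Bytes.toBytes(newBalance));

        // New in 1.0: the comparison is no longer restricted to EQUAL.
        return table.checkAndPut(row, cf, q, CompareOp.GREATER,
                Bytes.toBytes(expected), put);
    }
}
```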
hbase-client-1.0.0.jar,
TokenUtil.class
package org.apache.hadoop.hbase.security.token
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.addTokenIfMissing ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : boolean
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenIfMissing:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)Z]
TokenUtil.obtainAndCacheToken ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainAndCacheToken:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.Connection conn ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/Connection;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.Connection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
hbase-common-1.0.0.jar,
ByteBufferUtils.class
package org.apache.hadoop.hbase.util
ByteBufferUtils.compareTo ( java.nio.ByteBuffer buf1, int o1, int len1, java.nio.ByteBuffer buf2, int o2, int len2 ) [static] : int
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.compareTo:(Ljava/nio/ByteBuffer;IILjava/nio/ByteBuffer;II)I]
ByteBufferUtils.copyFromBufferToBuffer ( java.nio.ByteBuffer out, java.nio.ByteBuffer in, int sourceOffset, int destinationOffset, int length ) [static] : void
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.copyFromBufferToBuffer:(Ljava/nio/ByteBuffer;Ljava/nio/ByteBuffer;III)V]
ByteBufferUtils.toBytes ( java.nio.ByteBuffer buffer, int offset, int length ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/util/ByteBufferUtils.toBytes:(Ljava/nio/ByteBuffer;II)[B]
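A short sketch of the new ByteBufferUtils helpers listed above; it runs standalone:

```java
import java.nio.ByteBuffer;
import org.apache.hadoop.hbase.util.ByteBufferUtils;
import org.apache.hadoop.hbase.util.Bytes;

public class ByteBufferUtilsExample {
    public static void main(String[] args) {
        ByteBuffer a = ByteBuffer.wrap(Bytes.toBytes("apple"));
        ByteBuffer b = ByteBuffer.wrap(Bytes.toBytes("apricot"));

        // Lexicographic comparison of two buffer slices without copying.
        int cmp = ByteBufferUtils.compareTo(a, 0, a.remaining(), b, 0, b.remaining());
        System.out.println("compareTo: " + cmp);        // negative: "apple" < "apricot"

        // Copy the first three bytes of b into a new byte[].
        byte[] prefix = ByteBufferUtils.toBytes(b, 0, 3);
        System.out.println(Bytes.toString(prefix));     // "apr"
    }
}
```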
hbase-common-1.0.0.jar,
CellUtil.class
package org.apache.hadoop.hbase
CellUtil.createCell ( byte[ ] row ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] row, byte[ ] family, byte[ ] qualifier ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B[B[B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] row, byte[ ] value ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([B[B)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.createCell ( byte[ ] rowArray, int rowOffset, int rowLength, byte[ ] familyArray, int familyOffset, int familyLength, byte[ ] qualifierArray, int qualifierOffset, int qualifierLength ) [static] : Cell
[mangled: org/apache/hadoop/hbase/CellUtil.createCell:([BII[BII[BII)Lorg/apache/hadoop/hbase/Cell;]
CellUtil.estimatedHeapSizeOf ( Cell cell ) [static] : long
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedHeapSizeOf:(Lorg/apache/hadoop/hbase/Cell;)J]
CellUtil.estimatedSerializedSizeOf ( Cell cell ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedSerializedSizeOf:(Lorg/apache/hadoop/hbase/Cell;)I]
CellUtil.estimatedSerializedSizeOfKey ( Cell cell ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.estimatedSerializedSizeOfKey:(Lorg/apache/hadoop/hbase/Cell;)I]
CellUtil.findCommonPrefixInFlatKey ( Cell c1, Cell c2, boolean bypassFamilyCheck, boolean withTsType ) [static] : int
[mangled: org/apache/hadoop/hbase/CellUtil.findCommonPrefixInFlatKey:(Lorg/apache/hadoop/hbase/Cell;Lorg/apache/hadoop/hbase/Cell;ZZ)I]
CellUtil.getCellKeyAsString ( Cell cell ) [static] : String
[mangled: org/apache/hadoop/hbase/CellUtil.getCellKeyAsString:(Lorg/apache/hadoop/hbase/Cell;)Ljava/lang/String;]
CellUtil.getCellKeySerializedAsKeyValueKey ( Cell cell ) [static] : byte[ ]
[mangled: org/apache/hadoop/hbase/CellUtil.getCellKeySerializedAsKeyValueKey:(Lorg/apache/hadoop/hbase/Cell;)[B]
CellUtil.isDelete ( byte type ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDelete:(B)Z]
CellUtil.isDeleteColumnOrFamily ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteColumnOrFamily:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.isDeleteFamilyVersion ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteFamilyVersion:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.isDeleteType ( Cell cell ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.isDeleteType:(Lorg/apache/hadoop/hbase/Cell;)Z]
CellUtil.matchingColumn ( Cell left, byte[ ] fam, int foffset, int flength, byte[ ] qual, int qoffset, int qlength ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingColumn:(Lorg/apache/hadoop/hbase/Cell;[BII[BII)Z]
CellUtil.matchingFamily ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingFamily:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.matchingQualifier ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingQualifier:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.matchingRow ( Cell left, byte[ ] buf, int offset, int length ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.matchingRow:(Lorg/apache/hadoop/hbase/Cell;[BII)Z]
CellUtil.setSequenceId ( Cell cell, long seqId ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setSequenceId:(Lorg/apache/hadoop/hbase/Cell;J)V]
CellUtil.setTimestamp ( Cell cell, byte[ ] ts, int tsOffset ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setTimestamp:(Lorg/apache/hadoop/hbase/Cell;[BI)V]
CellUtil.setTimestamp ( Cell cell, long ts ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.setTimestamp:(Lorg/apache/hadoop/hbase/Cell;J)V]
CellUtil.updateLatestStamp ( Cell cell, byte[ ] ts, int tsOffset ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.updateLatestStamp:(Lorg/apache/hadoop/hbase/Cell;[BI)Z]
CellUtil.updateLatestStamp ( Cell cell, long ts ) [static] : boolean
[mangled: org/apache/hadoop/hbase/CellUtil.updateLatestStamp:(Lorg/apache/hadoop/hbase/Cell;J)Z]
CellUtil.writeFlatKey ( Cell cell, java.io.DataOutputStream out ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.writeFlatKey:(Lorg/apache/hadoop/hbase/Cell;Ljava/io/DataOutputStream;)V]
CellUtil.writeRowKeyExcludingCommon ( Cell cell, short rLen, int commonPrefix, java.io.DataOutputStream out ) [static] : void
[mangled: org/apache/hadoop/hbase/CellUtil.writeRowKeyExcludingCommon:(Lorg/apache/hadoop/hbase/Cell;SILjava/io/DataOutputStream;)V]
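A short sketch exercising a few of the CellUtil additions listed above; no cluster is needed:

```java
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.util.Bytes;

public class CellUtilExample {
    public static void main(String[] args) {
        byte[] row = Bytes.toBytes("row1");
        byte[] family = Bytes.toBytes("f");
        byte[] qualifier = Bytes.toBytes("q");

        // Build an in-memory Cell without constructing a KeyValue directly.
        Cell cell = CellUtil.createCell(row, family, qualifier);

        // A couple of the new inspection helpers.
        System.out.println("key as string:       " + CellUtil.getCellKeyAsString(cell));
        System.out.println("estimated heap size: " + CellUtil.estimatedHeapSizeOf(cell));
        System.out.println("matches family 'f':  "
                + CellUtil.matchingFamily(cell, family, 0, family.length));
    }
}
```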
hbase-common-1.0.0.jar,
Counter.class
package org.apache.hadoop.hbase.util
Counter.add ( long delta ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.add:(J)V]
Counter.Counter ( )
[mangled: org/apache/hadoop/hbase/util/Counter."<init>":()V]
Counter.Counter ( long initValue )
[mangled: org/apache/hadoop/hbase/util/Counter."<init>":(J)V]
Counter.decrement ( ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.decrement:()V]
Counter.get ( ) : long
[mangled: org/apache/hadoop/hbase/util/Counter.get:()J]
Counter.increment ( ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.increment:()V]
Counter.set ( long value ) : void
[mangled: org/apache/hadoop/hbase/util/Counter.set:(J)V]
Counter.toString ( ) : String
[mangled: org/apache/hadoop/hbase/util/Counter.toString:()Ljava/lang/String;]
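Counter is a simple concurrent counter utility. A minimal standalone sketch of the listed methods:

```java
import org.apache.hadoop.hbase.util.Counter;

public class CounterExample {
    public static void main(String[] args) throws InterruptedException {
        final Counter requests = new Counter();      // starts at 0

        // Counter is built for frequent updates from many threads.
        Thread[] workers = new Thread[4];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(new Runnable() {
                @Override
                public void run() {
                    for (int j = 0; j < 1000; j++) {
                        requests.increment();
                    }
                }
            });
            workers[i].start();
        }
        for (Thread worker : workers) {
            worker.join();
        }
        requests.add(42);                            // bulk adjustment
        System.out.println("total: " + requests.get());   // prints 4042
    }
}
```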
hbase-server-1.0.0.jar,
HFileOutputFormat2.class
package org.apache.hadoop.hbase.mapreduce
HFileOutputFormat2.configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.Table table, org.apache.hadoop.hbase.client.RegionLocator regionLocator ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoad:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/Table;Lorg/apache/hadoop/hbase/client/RegionLocator;)V]
HFileOutputFormat2.configureIncrementalLoadMap ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.Table table ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoadMap:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/Table;)V]
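The new configureIncrementalLoad overload takes a Table plus a RegionLocator (both obtained from a Connection) instead of an HTable. A job-setup sketch; the mapper wiring is omitted, and 'mytable' and the output path are placeholders:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.RegionLocator;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadJobSetup {
    public static Job createJob(Configuration conf) throws Exception {
        Job job = Job.getInstance(conf, "hfile-bulk-load");
        job.setJarByClass(BulkLoadJobSetup.class);
        // Mapper, input format, etc. are omitted; this only shows the
        // new configureIncrementalLoad(Job, Table, RegionLocator) signature.
        FileOutputFormat.setOutputPath(job, new Path("/tmp/hfiles"));

        TableName name = TableName.valueOf("mytable");   // placeholder table
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(name);
             RegionLocator locator = connection.getRegionLocator(name)) {
            // Configures partitioner, reducer and output format so the job
            // writes HFiles aligned with the table's current region boundaries.
            HFileOutputFormat2.configureIncrementalLoad(job, table, locator);
        }
        return job;
    }
}
```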
hbase-server-1.0.0.jar,
RowTooBigException.class
package org.apache.hadoop.hbase.regionserver
RowTooBigException.RowTooBigException ( String message )
[mangled: org/apache/hadoop/hbase/regionserver/RowTooBigException."<init>":(Ljava/lang/String;)V]
hbase-server-1.0.0.jar,
TableInputFormat.class
package org.apache.hadoop.hbase.mapreduce
TableInputFormat.getSplits ( org.apache.hadoop.mapreduce.JobContext context ) : java.util.List<org.apache.hadoop.mapreduce.InputSplit>
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormat.getSplits:(Lorg/apache/hadoop/mapreduce/JobContext;)Ljava/util/List;]
TableInputFormat.initialize ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormat.initialize:()V]
hbase-server-1.0.0.jar,
TableInputFormatBase.class
package org.apache.hadoop.hbase.mapreduce
TableInputFormatBase.closeTable ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.closeTable:()V]
TableInputFormatBase.getAdmin ( ) : org.apache.hadoop.hbase.client.Admin
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getAdmin:()Lorg/apache/hadoop/hbase/client/Admin;]
TableInputFormatBase.getRegionLocator ( ) : org.apache.hadoop.hbase.client.RegionLocator
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getRegionLocator:()Lorg/apache/hadoop/hbase/client/RegionLocator;]
TableInputFormatBase.getTable ( ) : org.apache.hadoop.hbase.client.Table
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.getTable:()Lorg/apache/hadoop/hbase/client/Table;]
TableInputFormatBase.initialize ( ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.initialize:()V]
TableInputFormatBase.initializeTable ( org.apache.hadoop.hbase.client.Connection connection, org.apache.hadoop.hbase.TableName tableName ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableInputFormatBase.initializeTable:(Lorg/apache/hadoop/hbase/client/Connection;Lorg/apache/hadoop/hbase/TableName;)V]
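The TableInputFormatBase additions above replace the old pattern of handing the format an HTable. A minimal sketch of a subclass using the new initializeTable hook; the method name connect and the way the Connection is obtained are assumptions made for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.mapreduce.TableInputFormatBase;

public class MyInputFormat extends TableInputFormatBase {
    /** Hypothetical setup hook: open a Connection and hand the table to the base class. */
    public void connect(Configuration conf, String table) throws IOException {
        Connection connection = ConnectionFactory.createConnection(conf);
        initializeTable(connection, TableName.valueOf(table)); // new in 1.0, listed above
        // closeTable(), also listed above, releases the held table once the job is done.
    }
}
```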
hbase-server-1.0.0.jar,
TableRecordReader.class
package org.apache.hadoop.hbase.mapreduce
TableRecordReader.setHTable ( org.apache.hadoop.hbase.client.Table htable ) : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/mapreduce/TableRecordReader.setHTable:(Lorg/apache/hadoop/hbase/client/Table;)V]
TableRecordReader.setTable ( org.apache.hadoop.hbase.client.Table table ) : void
[mangled: org/apache/hadoop/hbase/mapreduce/TableRecordReader.setTable:(Lorg/apache/hadoop/hbase/client/Table;)V]
Removed Methods (34)
hbase-client-0.98.9.jar,
ClientScanner.class
package org.apache.hadoop.hbase.client
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, byte[ ] tableName ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;[B)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, byte[ ] tableName, HConnection connection ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;[BLorg/apache/hadoop/hbase/client/HConnection;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;)V]
ClientScanner.ClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory controllerFactory )
[mangled: org/apache/hadoop/hbase/client/ClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;)V]
ClientScanner.getConnection ( ) : HConnection
[mangled: org/apache/hadoop/hbase/client/ClientScanner.getConnection:()Lorg/apache/hadoop/hbase/client/HConnection;]
ClientScanner.getScannerCallable ( byte[ ] localStartKey, int nbRows ) : ScannerCallable
[mangled: org/apache/hadoop/hbase/client/ClientScanner.getScannerCallable:([BI)Lorg/apache/hadoop/hbase/client/ScannerCallable;]
hbase-client-0.98.9.jar,
ClientSmallScanner.class
package org.apache.hadoop.hbase.client
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;)V]
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
ClientSmallScanner.ClientSmallScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection, RpcRetryingCallerFactory rpcFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory controllerFactory )
[mangled: org/apache/hadoop/hbase/client/ClientSmallScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;)V]
hbase-client-0.98.9.jar,
Filter.class
package org.apache.hadoop.hbase.filter
Filter.filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> p1 ) [abstract] : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/filter/Filter.filterRow:(Ljava/util/List;)V]
hbase-client-0.98.9.jar,
FilterList.class
package org.apache.hadoop.hbase.filter
FilterList.filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> kvs ) : void *DEPRECATED*
[mangled: org/apache/hadoop/hbase/filter/FilterList.filterRow:(Ljava/util/List;)V]
hbase-client-0.98.9.jar,
Get.class
package org.apache.hadoop.hbase.client
Get.setCacheBlocks ( boolean cacheBlocks ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setCacheBlocks:(Z)V]
Get.setCheckExistenceOnly ( boolean checkExistenceOnly ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setCheckExistenceOnly:(Z)V]
Get.setClosestRowBefore ( boolean closestRowBefore ) : void
[mangled: org/apache/hadoop/hbase/client/Get.setClosestRowBefore:(Z)V]
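These Get setters are "removed" in the binary sense: the void-returning descriptors are gone because the 1.0 setters return the Get instance (fluent style). Source like the sketch below still compiles against either version, but a jar compiled against 0.98 links to the (Z)V descriptors and fails on 1.0 at runtime with NoSuchMethodError until it is recompiled. The row key is illustrative.

```java
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.util.Bytes;

public class GetExample {
    public static void main(String[] args) {
        Get get = new Get(Bytes.toBytes("row1")); // illustrative row key
        get.setCacheBlocks(false);                // 0.98 bytecode links to setCacheBlocks:(Z)V
        get.setCheckExistenceOnly(true);          // ...and setCheckExistenceOnly:(Z)V
        get.setClosestRowBefore(false);           // ...and setClosestRowBefore:(Z)V
    }
}
```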
hbase-client-0.98.9.jar,
HTable.class
package org.apache.hadoop.hbase.client
HTable.HTable ( byte[ ] tableName, HConnection connection, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":([BLorg/apache/hadoop/hbase/client/HConnection;Ljava/util/concurrent/ExecutorService;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Ljava/util/concurrent/ExecutorService;)V]
HTable.HTable ( org.apache.hadoop.hbase.TableName tableName, HConnection connection, TableConfiguration tableConfig, RpcRetryingCallerFactory rpcCallerFactory, org.apache.hadoop.hbase.ipc.RpcControllerFactory rpcControllerFactory, java.util.concurrent.ExecutorService pool )
[mangled: org/apache/hadoop/hbase/client/HTable."<init>":(Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/client/TableConfiguration;Lorg/apache/hadoop/hbase/client/RpcRetryingCallerFactory;Lorg/apache/hadoop/hbase/ipc/RpcControllerFactory;Ljava/util/concurrent/ExecutorService;)V]
HTable.main ( String[ ] args ) [static] : void
[mangled: org/apache/hadoop/hbase/client/HTable.main:([Ljava/lang/String;)V]
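Most of the removed HTable members above are HConnection-based constructors (plus the main method). Code that built HTable instances this way generally moves to the Connection-based factory pattern; a minimal sketch of the usual 1.0 replacement, with an illustrative table name:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

public class TableAccess {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("mytable"))) { // illustrative name
            // issue table.get(...) / table.put(...) here as before
        }
    }
}
```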
hbase-client-0.98.9.jar,
ReversedClientScanner.class
package org.apache.hadoop.hbase.client
ReversedClientScanner.getScannerCallable ( byte[ ] localStartKey, int nbRows, byte[ ] locateStartRow ) : ScannerCallable
[mangled: org/apache/hadoop/hbase/client/ReversedClientScanner.getScannerCallable:([BI[B)Lorg/apache/hadoop/hbase/client/ScannerCallable;]
ReversedClientScanner.ReversedClientScanner ( org.apache.hadoop.conf.Configuration conf, Scan scan, org.apache.hadoop.hbase.TableName tableName, HConnection connection )
[mangled: org/apache/hadoop/hbase/client/ReversedClientScanner."<init>":(Lorg/apache/hadoop/conf/Configuration;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/HConnection;)V]
hbase-client-0.98.9.jar,
ReversedScannerCallable.class
package org.apache.hadoop.hbase.client
ReversedScannerCallable.ReversedScannerCallable ( HConnection connection, org.apache.hadoop.hbase.TableName tableName, Scan scan, metrics.ScanMetrics scanMetrics, byte[ ] locateStartRow ) *DEPRECATED*
[mangled: org/apache/hadoop/hbase/client/ReversedScannerCallable."<init>":(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/client/metrics/ScanMetrics;[B)V]
ReversedScannerCallable.ReversedScannerCallable ( HConnection connection, org.apache.hadoop.hbase.TableName tableName, Scan scan, metrics.ScanMetrics scanMetrics, byte[ ] locateStartRow, org.apache.hadoop.hbase.ipc.PayloadCarryingRpcController rpcFactory )
[mangled: org/apache/hadoop/hbase/client/ReversedScannerCallable."<init>":(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/TableName;Lorg/apache/hadoop/hbase/client/Scan;Lorg/apache/hadoop/hbase/client/metrics/ScanMetrics;[BLorg/apache/hadoop/hbase/ipc/PayloadCarryingRpcController;)V]
hbase-client-0.98.9.jar,
TokenUtil.class
package org.apache.hadoop.hbase.security.token
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.addTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.addTokenIfMissing ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : boolean
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.addTokenIfMissing:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)Z]
TokenUtil.obtainAndCacheToken ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainAndCacheToken:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)V]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.HConnection conn ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/HConnection;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainToken ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user ) [static] : org.apache.hadoop.security.token.Token<AuthenticationTokenIdentifier>
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainToken:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;)Lorg/apache/hadoop/security/token/Token;]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.hbase.security.User user, org.apache.hadoop.mapreduce.Job job ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/hbase/security/User;Lorg/apache/hadoop/mapreduce/Job;)V]
TokenUtil.obtainTokenForJob ( org.apache.hadoop.hbase.client.HConnection conn, org.apache.hadoop.mapred.JobConf job, org.apache.hadoop.hbase.security.User user ) [static] : void
[mangled: org/apache/hadoop/hbase/security/token/TokenUtil.obtainTokenForJob:(Lorg/apache/hadoop/hbase/client/HConnection;Lorg/apache/hadoop/mapred/JobConf;Lorg/apache/hadoop/hbase/security/User;)V]
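Every removed TokenUtil method above takes an HConnection. The sketch below is 0.98-era code that links against one of those descriptors; run against the 1.0 jars without recompiling, the call site fails with NoSuchMethodError. (That the 1.0 replacements take a Connection instead is an assumption based on the Connection-oriented additions elsewhere in this report.)

```java
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.security.User;
import org.apache.hadoop.hbase.security.token.TokenUtil;
import org.apache.hadoop.mapreduce.Job;

public class TokenSetup {
    /** Compiles against 0.98 only; the HConnection overload is gone in 1.0. */
    static void addDelegationToken(HConnection conn, User user, Job job) throws Exception {
        TokenUtil.obtainTokenForJob(conn, user, job);
    }
}
```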
hbase-server-0.98.9.jar,
HFileOutputFormat2.class
package org.apache.hadoop.hbase.mapreduce
HFileOutputFormat2.configureIncrementalLoadMap ( org.apache.hadoop.mapreduce.Job job, org.apache.hadoop.hbase.client.HTable table ) [static] : void
[mangled: org/apache/hadoop/hbase/mapreduce/HFileOutputFormat2.configureIncrementalLoadMap:(Lorg/apache/hadoop/mapreduce/Job;Lorg/apache/hadoop/hbase/client/HTable;)V]
Problems with Data Types, High Severity (29)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase.client
[+] Attributes (2)
| Change | Effect |
|---|
| 1 | Abstract method setAttribute ( java.lang.String, byte[ ] ) has been added to this interface. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method setAttribute(java.lang.String, byte[]) in Attributes. |
| 2 | Abstract method setAttribute ( java.lang.String, byte[ ] ) has been removed from this interface. | Recompilation of a client program may be terminated with the message: cannot find method setAttribute(java.lang.String, byte[]) in interface Attributes. |
[+] affected methods (2)
getAttribute ( java.lang.String )This abstract method is from 'Attributes' interface.
getAttributesMap ( )This abstract method is from 'Attributes' interface.
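Taken together, the two changes above amount to a return-type change on setAttribute (void in 0.98, a non-void return in 1.0 per the added abstract method). A client class implementing the 0.98 shape of the interface, as sketched below, compiles against 0.98 but fails to recompile against 1.0 exactly as the effect column describes.

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.hbase.client.Attributes;

public class MyAttributes implements Attributes {
    private final Map<String, byte[]> attrs = new HashMap<String, byte[]>();

    public void setAttribute(String name, byte[] value) { // 0.98 signature; clashes with 1.0
        attrs.put(name, value);
    }

    public byte[] getAttribute(String name) {
        return attrs.get(name);
    }

    public Map<String, byte[]> getAttributesMap() {
        return attrs;
    }
}
```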
[+] ClientScanner (1)
| Change | Effect |
|---|
| 1 | Type of field callable has been changed from ScannerCallable to ScannerCallableWithReplicas. | Recompilation of a client program may be terminated with the message: incompatible types, found: ScannerCallable, required: ScannerCallableWithReplicas. |
[+] affected methods (10)
checkScanStopRow ( byte[ ] )This method is from 'ClientScanner' class.
close ( )This method is from 'ClientScanner' class.
getScan ( )This method is from 'ClientScanner' class.
getTable ( )This method is from 'ClientScanner' class.
getTableName ( )This method is from 'ClientScanner' class.
getTimestamp ( )This method is from 'ClientScanner' class.
initializeScannerInConstruction ( )This method is from 'ClientScanner' class.
next ( )This method is from 'ClientScanner' class.
nextScanner ( int, boolean )This method is from 'ClientScanner' class.
writeScanMetrics ( )This method is from 'ClientScanner' class.
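A hedged sketch of the kind of 0.98-era subclass the callable change above breaks. It assumes the field is visible to subclasses (as the effect column implies) and uses a 0.98 constructor that is itself removed in 1.0, so it compiles against 0.98 only.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.ClientScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.ScannerCallable;

public class MyScanner extends ClientScanner {
    public MyScanner(Configuration conf, Scan scan, byte[] tableName) throws IOException {
        super(conf, scan, tableName); // 0.98-only constructor (see Removed Methods above)
    }

    void swapCallable(ScannerCallable c) {
        this.callable = c; // 1.0: field is ScannerCallableWithReplicas -> incompatible types
    }
}
```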
[+] HConnection (6)
| Change | Effect |
|---|
| 1 | Abstract method getAdmin ( ) has been added to this interface. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method getAdmin() in HConnection. |
| 2 | Abstract method getRegionLocator ( org.apache.hadoop.hbase.TableName ) has been added to this interface. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method getRegionLocator(org.apache.hadoop.hbase.TableName) in HConnection. |
| 3 | Abstract method updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.ServerName ) has been added to this interface. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method updateCachedLocations(org.apache.hadoop.hbase.TableName, byte[], byte[], java.lang.Object, org.apache.hadoop.hbase.ServerName) in HConnection. |
| 4 | Added super-interface Connection. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method in Connection. |
| 5 | Removed super-interface java.io.Closeable. | Recompilation of a client program may be terminated with the message: cannot find method in interface HConnection. |
| 6 | Removed super-interface org.apache.hadoop.hbase.Abortable. | Recompilation of a client program may be terminated with the message: cannot find method in interface HConnection. |
[+] affected methods (59)
clearCaches ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
clearRegionCache ( )This abstract method is from 'HConnection' interface.
clearRegionCache ( byte[ ] )This abstract method is from 'HConnection' interface.
clearRegionCache ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
deleteCachedRegionLocation ( org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getAdmin ( org.apache.hadoop.hbase.ServerName, boolean )This abstract method is from 'HConnection' interface.
getClient ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
getConfiguration ( )This abstract method is from 'HConnection' interface.
getCurrentNrHRS ( )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( byte[ ] )This abstract method is from 'HConnection' interface.
getHTableDescriptor ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getHTableDescriptors ( java.util.List<java.lang.String> )This abstract method is from 'HConnection' interface.
getHTableDescriptorsByTableName ( java.util.List<org.apache.hadoop.hbase.TableName> )This abstract method is from 'HConnection' interface.
getKeepAliveMasterService ( )This abstract method is from 'HConnection' interface.
getMaster ( )This abstract method is from 'HConnection' interface.
getNonceGenerator ( )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( byte[ ] )This abstract method is from 'HConnection' interface.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getRegionLocation ( byte[ ], byte[ ], boolean )This abstract method is from 'HConnection' interface.
getRegionLocation ( org.apache.hadoop.hbase.TableName, byte[ ], boolean )This abstract method is from 'HConnection' interface.
getTable ( byte[ ] )This abstract method is from 'HConnection' interface.
getTable ( byte[ ], java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String )This abstract method is from 'HConnection' interface.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This abstract method is from 'HConnection' interface.
getTableNames ( )This abstract method is from 'HConnection' interface.
isClosed ( )This abstract method is from 'HConnection' interface.
isDeadServer ( org.apache.hadoop.hbase.ServerName )This abstract method is from 'HConnection' interface.
isMasterRunning ( )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( byte[ ], byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableAvailable ( org.apache.hadoop.hbase.TableName, byte[ ][ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableDisabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
isTableEnabled ( byte[ ] )This abstract method is from 'HConnection' interface.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
listTableNames ( )This abstract method is from 'HConnection' interface.
listTables ( )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
locateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ] )This abstract method is from 'HConnection' interface.
locateRegions ( byte[ ], boolean, boolean )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName )This abstract method is from 'HConnection' interface.
locateRegions ( org.apache.hadoop.hbase.TableName, boolean, boolean )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatch ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ] )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, byte[ ], java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
processBatchCallback ( java.util.List<? extends Row>, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This abstract method is from 'HConnection' interface.
relocateRegion ( byte[ ], byte[ ] )This abstract method is from 'HConnection' interface.
relocateRegion ( org.apache.hadoop.hbase.TableName, byte[ ] )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( byte[ ], boolean )This abstract method is from 'HConnection' interface.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This abstract method is from 'HConnection' interface.
updateCachedLocations ( byte[ ], byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
updateCachedLocations ( org.apache.hadoop.hbase.TableName, byte[ ], java.lang.Object, org.apache.hadoop.hbase.HRegionLocation )This abstract method is from 'HConnection' interface.
getConnection ( )Return value of this method has type 'HConnection'.
[+] HTable (2)
| Change | Effect |
|---|
| 1 | Type of field ap has been changed from AsyncProcess<java.lang.Object> to AsyncProcess. | Recompilation of a client program may be terminated with the message: incompatible types, found: AsyncProcess<java.lang.Object>, required: AsyncProcess. |
| 2 | Type of field connection has been changed from HConnection to ClusterConnection. | Recompilation of a client program may be terminated with the message: incompatible types, found: HConnection, required: ClusterConnection. |
[+] affected methods (86)
append ( Append )This method is from 'HTable' class.
batch ( java.util.List<? extends Row> )This method is from 'HTable' class.
batch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
checkAndDelete ( byte[ ], byte[ ], byte[ ], byte[ ], Delete )This method is from 'HTable' class.
checkAndMutate ( byte[ ], byte[ ], byte[ ], org.apache.hadoop.hbase.filter.CompareFilter.CompareOp, byte[ ], RowMutations )This method is from 'HTable' class.
checkAndPut ( byte[ ], byte[ ], byte[ ], byte[ ], Put )This method is from 'HTable' class.
clearRegionCache ( )This method is from 'HTable' class.
close ( )This method is from 'HTable' class.
coprocessorService ( byte[ ] )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R> )This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
delete ( java.util.List<Delete> )This method is from 'HTable' class.
delete ( Delete )This method is from 'HTable' class.
exists ( java.util.List<Get> )This method is from 'HTable' class.
exists ( Get )This method is from 'HTable' class.
flushCommits ( )This method is from 'HTable' class.
get ( java.util.List<Get> )This method is from 'HTable' class.
get ( Get )This method is from 'HTable' class.
getConfiguration ( )This method is from 'HTable' class.
getConnection ( )This method is from 'HTable' class.
getDefaultExecutor ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getEndKeys ( )This method is from 'HTable' class.
getMaxKeyValueSize ( org.apache.hadoop.conf.Configuration )This method is from 'HTable' class.
getName ( )This method is from 'HTable' class.
getOperationTimeout ( )This method is from 'HTable' class.
getRegionCachePrefetch ( byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
getRegionLocation ( byte[ ] )This method is from 'HTable' class.
getRegionLocation ( byte[ ], boolean )This method is from 'HTable' class.
getRegionLocation ( java.lang.String )This method is from 'HTable' class.
getRegionLocations ( )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ] )This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ], boolean )This method is from 'HTable' class.
getRowOrBefore ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ] )This method is from 'HTable' class.
getScanner ( byte[ ], byte[ ] )This method is from 'HTable' class.
getScanner ( Scan )This method is from 'HTable' class.
getScannerCaching ( )This method is from 'HTable' class.
getStartEndKeys ( )This method is from 'HTable' class.
getStartKeys ( )This method is from 'HTable' class.
getTableDescriptor ( )This method is from 'HTable' class.
getTableName ( )This method is from 'HTable' class.
getWriteBuffer ( )This method is from 'HTable' class.
getWriteBufferSize ( )This method is from 'HTable' class.
HTable ( )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ] )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ], java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, java.lang.String )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )This constructor is from 'HTable' class.
increment ( Increment )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, boolean )This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, Durability )This method is from 'HTable' class.
isAutoFlush ( )This method is from 'HTable' class.
isTableEnabled ( byte[ ] )This method is from 'HTable' class.
isTableEnabled ( java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, byte[ ] )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, java.lang.String )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.hbase.TableName )This method is from 'HTable' class.
mutateRow ( RowMutations )This method is from 'HTable' class.
processBatch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
processBatchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
put ( java.util.List<Put> )This method is from 'HTable' class.
put ( Put )This method is from 'HTable' class.
setAutoFlush ( boolean )This method is from 'HTable' class.
setAutoFlush ( boolean, boolean )This method is from 'HTable' class.
setAutoFlushTo ( boolean )This method is from 'HTable' class.
setOperationTimeout ( int )This method is from 'HTable' class.
setRegionCachePrefetch ( byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ], boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean )This method is from 'HTable' class.
setScannerCaching ( int )This method is from 'HTable' class.
setWriteBufferSize ( long )This method is from 'HTable' class.
toString ( )This method is from 'HTable' class.
validatePut ( Put )This method is from 'HTable' class.
validatePut ( Put, int )This method is from 'HTable' class.
configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job, HTable )2nd parameter 'table' of this method has type 'HTable'.
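A hedged sketch of the breakage the HTable field changes above cause for subclasses. It assumes the connection field is visible to subclasses (as the effect column implies); the assignment is fine against 0.98, but against 1.0 the field is a ClusterConnection, so recompilation fails.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.client.HTable;

public class MyTable extends HTable {
    public MyTable(Configuration conf, TableName name) throws IOException {
        super(conf, name);
    }

    void reuseConnection(HConnection c) {
        this.connection = c; // 1.0: incompatible types, found: HConnection, required: ClusterConnection
    }
}
```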
[+] HTableInterface (2)
| Change | Effect |
|---|
| 1 | Added super-interface Table. | Recompilation of a client program may be terminated with the message: a client class C is not abstract and does not override abstract method in Table. |
| 2 | Removed super-interface java.io.Closeable. | Recompilation of a client program may be terminated with the message: cannot find method in interface HTableInterface. |
[+] affected methods (10)
getTable ( byte[ ] )Return value of this abstract method has type 'HTableInterface'.
getTable ( byte[ ], java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String )Return value of this abstract method has type 'HTableInterface'.
getTable ( java.lang.String, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName )Return value of this abstract method has type 'HTableInterface'.
getTable ( org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService )Return value of this abstract method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'table' of this method has type 'HTableInterface'.
createHTableInterface ( org.apache.hadoop.conf.Configuration, byte[ ] )Return value of this abstract method has type 'HTableInterface'.
releaseHTableInterface ( HTableInterface )1st parameter 'p1' of this abstract method has type 'HTableInterface'.
package org.apache.hadoop.hbase.filter
[+] Filter (1)
| Change | Effect |
|---|
| 1 | Abstract method filterRow ( java.util.List<org.apache.hadoop.hbase.KeyValue> ) has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find method filterRow(java.util.List<org.apache.hadoop.hbase.KeyValue>) in class Filter. |
[+] affected methods (30)
setFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
setFilter ( Filter )Field 'retval.filter' in return value of this method has type 'Filter'.
Filter ( )This constructor is from 'Filter' abstract class.
filterAllRemaining ( )This abstract method is from 'Filter' abstract class.
filterKeyValue ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
filterRow ( )This abstract method is from 'Filter' abstract class.
filterRowCells ( java.util.List<org.apache.hadoop.hbase.Cell> )This abstract method is from 'Filter' abstract class.
filterRowKey ( byte[ ], int, int )This abstract method is from 'Filter' abstract class.
getNextCellHint ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
getNextKeyHint ( org.apache.hadoop.hbase.KeyValue )This abstract method is from 'Filter' abstract class.
hasFilterRow ( )This abstract method is from 'Filter' abstract class.
isFamilyEssential ( byte[ ] )This abstract method is from 'Filter' abstract class.
isReversed ( )This method is from 'Filter' abstract class.
parseFrom ( byte[ ] )This method is from 'Filter' abstract class.
reset ( )This abstract method is from 'Filter' abstract class.
setReversed ( boolean )This method is from 'Filter' abstract class.
toByteArray ( )This abstract method is from 'Filter' abstract class.
transform ( org.apache.hadoop.hbase.KeyValue )This abstract method is from 'Filter' abstract class.
transformCell ( org.apache.hadoop.hbase.Cell )This abstract method is from 'Filter' abstract class.
addFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
parseFilterString ( byte[ ] )Return value of this method has type 'Filter'.
parseFilterString ( java.lang.String )Return value of this method has type 'Filter'.
parseSimpleFilterExpression ( byte[ ] )Return value of this method has type 'Filter'.
popArguments ( java.util.Stack<java.nio.ByteBuffer>, java.util.Stack<Filter> )Return value of this method has type 'Filter'.
createFilterFromArguments ( java.util.ArrayList<byte[ ]> )Return value of this method has type 'Filter'.
createFilterFromArguments ( java.util.ArrayList<byte[ ]> )Return value of this method has type 'Filter'.
getFilter ( )Return value of this method has type 'Filter'.
SkipFilter ( Filter )1st parameter 'filter' of this method has type 'Filter'.
filterKv ( Filter, org.apache.hadoop.hbase.Cell )1st parameter 'filter' of this method has type 'Filter'.
instantiateFilter ( org.apache.hadoop.conf.Configuration )Return value of this method has type 'Filter'.
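The removal above retires the last KeyValue-typed row hook on Filter. A migration sketch: a custom filter that used to override filterRow(List&lt;KeyValue&gt;) moves its per-row logic to filterRowCells(List&lt;Cell&gt;), which is listed among the Filter methods above and is not removed in 1.0. FilterBase is assumed as the base class (supplying defaults for the remaining methods), and the filter itself is purely illustrative.

```java
import java.util.List;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.filter.FilterBase;

public class RowCappingFilter extends FilterBase {
    private final int maxCellsPerRow;

    public RowCappingFilter(int maxCellsPerRow) {
        this.maxCellsPerRow = maxCellsPerRow;
    }

    @Override
    public ReturnCode filterKeyValue(Cell cell) {
        return ReturnCode.INCLUDE; // keep every cell; trimming happens per row below
    }

    @Override
    public void filterRowCells(List<Cell> cells) { // portable across 0.98 and 1.0
        while (cells.size() > maxCellsPerRow) {
            cells.remove(cells.size() - 1);
        }
    }

    @Override
    public boolean hasFilterRow() {
        return true; // signal that this filter modifies rows
    }
}
```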
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.master
[+] HMaster (2)
| Change | Effect |
|---|
| 1 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.MasterProtos.MasterService.BlockingInterface. | Recompilation of a client program may be terminated with the message: cannot find method in class HMaster. |
| 2 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.RegionServerStatusProtos.RegionServerStatusService.BlockingInterface. | Recompilation of a client program may be terminated with the message: cannot find method in class HMaster. |
[+] affected methods (2)
getActiveMaster ( )Return value of this method has type 'HMaster'.
getMaster ( int )Return value of this method has type 'HMaster'.
package org.apache.hadoop.hbase.regionserver
[+] HRegionServer (13)
| Change | Effect |
|---|
| 1 | Access level of field abortRequested has been changed from protected to private. | Recompilation of a client program may be terminated with the message: abortRequested has private access in HRegionServer. |
| 2 | Access level of field stopped has been changed from protected to private. | Recompilation of a client program may be terminated with the message: stopped has private access in HRegionServer. |
| 3 | Removed super-interface java.lang.Runnable. | Recompilation of a client program may be terminated with the message: cannot find method in class HRegionServer. |
| 4 | Removed super-interface org.apache.hadoop.hbase.ipc.HBaseRPCErrorHandler. | Recompilation of a client program may be terminated with the message: cannot find method in class HRegionServer. |
| 5 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.AdminProtos.AdminService.BlockingInterface. | Recompilation of a client program may be terminated with the message: cannot find method in class HRegionServer. |
| 6 | Removed super-interface org.apache.hadoop.hbase.protobuf.generated.ClientProtos.ClientService.BlockingInterface. | Recompilation of a client program may be terminated with the message: cannot find method in class HRegionServer. |
| 7 | Field REGIONSERVER_CONF (java.lang.String) with the compile-time constant value "regionserver_conf" has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable REGIONSERVER_CONF in HRegionServer. |
| 8 | Field REGION_SERVER_RPC_SCHEDULER_FACTORY_CLASS (java.lang.String) with the compile-time constant value "hbase.region.server.rpc.scheduler.factory.class" has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable REGION_SERVER_RPC_SCHEDULER_FACTORY_CLASS in HRegionServer. |
| 9 | Field catalogTracker of type org.apache.hadoop.hbase.catalog.CatalogTracker has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable catalogTracker in HRegionServer. |
| 10 | Field hlog of type wal.HLog has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable hlog in HRegionServer. |
| 11 | Field hlogForMeta of type wal.HLog has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable hlogForMeta in HRegionServer. |
| 12 | Field isOnline of type boolean has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable isOnline in HRegionServer. |
| 13 | Field maxScannerResultSize of type long has been removed from this class. | Recompilation of a client program may be terminated with the message: cannot find variable maxScannerResultSize in HRegionServer. |
[+] affected methods (1)
getRegionServer ( int )Return value of this method has type 'HRegionServer'.
Problems with Methods, High Severity (1)
hbase-common-0.98.9.jar,
AuthUtil.class
package org.apache.hadoop.hbase
[+] AuthUtil.AuthUtil ( ) (1)
[mangled: org/apache/hadoop/hbase/AuthUtil."<init>":()V]
| Change | Effect |
|---|
| 1 | Access level has been changed from public to private. | Recompilation of a client program may be terminated with the message: AuthUtil() has private access in org.apache.hadoop.hbase.AuthUtil. |
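This single method-level problem is easy to trip over even though AuthUtil is mostly a static-utility class. The sketch below compiles against 0.98 but not against 1.0, where the no-arg constructor is private:

```java
import org.apache.hadoop.hbase.AuthUtil;

public class AuthUtilUser {
    public static void main(String[] args) {
        AuthUtil util = new AuthUtil(); // 1.0: "AuthUtil() has private access in ...AuthUtil"
        System.out.println(util);
    }
}
```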
Problems with Data Types, Medium Severity (2)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase.client
[+] NoServerForRegionException (1)
| Change | Effect |
|---|
| 1 | Superclass has been changed from org.apache.hadoop.hbase.RegionException to DoNotRetryRegionException. | 1) Recompilation of a client program may be terminated with the message: cannot find variable (or method) in NoServerForRegionException. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class. Recompilation of a client class may be terminated with the message: reference to variable is ambiguous. |
[+] affected methods (2)
NoServerForRegionException ( )This constructor is from 'NoServerForRegionException' class.
NoServerForRegionException ( java.lang.String )This constructor is from 'NoServerForRegionException' class.
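The superclass change above surfaces only in exception handling: on 0.98 this exception was caught by handlers keyed on RegionException, while on 1.0 it descends from DoNotRetryRegionException instead. Catching IOException (a common ancestor in both versions) or testing the concrete type, as in the sketch below, behaves the same either way.

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.NoServerForRegionException;

public class RegionErrorHandling {
    /** Classify a lookup failure without depending on the changed superclass. */
    static String classify(IOException e) {
        if (e instanceof NoServerForRegionException) {
            return "no-server-for-region";
        }
        return "other";
    }
}
```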
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.master
[+] HMaster (1)
| Change | Effect |
|---|
| 1 | Superclass has been changed from org.apache.hadoop.hbase.util.HasThread to org.apache.hadoop.hbase.regionserver.HRegionServer. | 1) Recompilation of a client program may be terminated with the message: cannot find variable (or method) in HMaster. 2) A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class. Recompilation of a client class may be terminated with the message: reference to variable is ambiguous. |
[+] affected methods (2)
getActiveMaster ( )Return value of this method has type 'HMaster'.
getMaster ( int )Return value of this method has type 'HMaster'.
Problems with Data Types, Low Severity (2)
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.regionserver
[+] HRegionServer (1)
| Change | Effect |
|---|
| 1 | Added super-class org.apache.hadoop.hbase.util.HasThread. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class. Recompilation of a client class may be terminated with the message: reference to variable is ambiguous. |
[+] affected methods (1)
getRegionServer ( int )Return value of this method has type 'HRegionServer'.
package org.apache.hadoop.hbase.regionserver.wal
[+] HLogPrettyPrinter (1)
| Change | Effect |
|---|
| 1 | Added super-class org.apache.hadoop.hbase.wal.WALPrettyPrinter. | A static field from a super-interface of a client class may hide a field (with the same name) inherited from new super-class. Recompilation of a client class may be terminated with the message: reference to variable is ambiguous. |
[+] affected methods (3)
HLogPrettyPrinter ( )This constructor is from 'HLogPrettyPrinter' class.
HLogPrettyPrinter ( boolean, boolean, long, java.lang.String, java.lang.String, boolean, java.io.PrintStream )This constructor is from 'HLogPrettyPrinter' class.
main ( java.lang.String[ ] )This method is from 'HLogPrettyPrinter' class.
Other Changes in Data Types (30)
hbase-client-0.98.9.jar
package org.apache.hadoop.hbase
[+] HRegionInfo (2)
| Change | Effect |
|---|
| 1 | Field REPLICA_ID_DELIMITER has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to REPLICA_ID_DELIMITER is ambiguous. |
| 2 | Field REPLICA_ID_FORMAT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to REPLICA_ID_FORMAT is ambiguous. |
[+] affected methods (75)
checkScanStopRow ( byte[ ] )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
close ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getScan ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTable ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTableName ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
getTimestamp ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
initializeScannerInConstruction ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
next ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
nextScanner ( int, boolean )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
writeScanMetrics ( )Field 'this.currentRegion' in the object of this method has type 'HRegionInfo'.
areAdjacent ( HRegionInfo, HRegionInfo )This method is from 'HRegionInfo' class.
compareTo ( java.lang.Object )This method is from 'HRegionInfo' class.
compareTo ( HRegionInfo )This method is from 'HRegionInfo' class.
containsRange ( byte[ ], byte[ ] )This method is from 'HRegionInfo' class.
containsRow ( byte[ ] )This method is from 'HRegionInfo' class.
convert ( HRegionInfo )This method is from 'HRegionInfo' class.
convert ( protobuf.generated.HBaseProtos.RegionInfo )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], byte[ ], boolean )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], java.lang.String, boolean )This method is from 'HRegionInfo' class.
createRegionName ( TableName, byte[ ], long, boolean )This method is from 'HRegionInfo' class.
encodeRegionName ( byte[ ] )This method is from 'HRegionInfo' class.
equals ( java.lang.Object )This method is from 'HRegionInfo' class.
getComparator ( )This method is from 'HRegionInfo' class.
getDaughterRegions ( client.Result )This method is from 'HRegionInfo' class.
getEncodedName ( )This method is from 'HRegionInfo' class.
getEncodedNameAsBytes ( )This method is from 'HRegionInfo' class.
getEndKey ( )This method is from 'HRegionInfo' class.
getHRegionInfo ( client.Result )This method is from 'HRegionInfo' class.
getHRegionInfo ( client.Result, byte[ ] )This method is from 'HRegionInfo' class.
getHRegionInfoAndServerName ( client.Result )This method is from 'HRegionInfo' class.
getMergeRegions ( client.Result )This method is from 'HRegionInfo' class.
getRegionId ( )This method is from 'HRegionInfo' class.
getRegionName ( )This method is from 'HRegionInfo' class.
getRegionNameAsString ( )This method is from 'HRegionInfo' class.
getSeqNumDuringOpen ( client.Result )This method is from 'HRegionInfo' class.
getServerName ( client.Result )This method is from 'HRegionInfo' class.
getShortNameToLog ( )This method is from 'HRegionInfo' class.
getStartKey ( )This method is from 'HRegionInfo' class.
getStartKey ( byte[ ] )This method is from 'HRegionInfo' class.
getTable ( )This method is from 'HRegionInfo' class.
getTable ( byte[ ] )This method is from 'HRegionInfo' class.
getTableName ( )This method is from 'HRegionInfo' class.
getTableName ( byte[ ] )This method is from 'HRegionInfo' class.
getVersion ( )This method is from 'HRegionInfo' class.
hashCode ( )This method is from 'HRegionInfo' class.
HRegionInfo ( )This constructor is from 'HRegionInfo' class.
HRegionInfo ( HRegionInfo )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ] )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ], boolean )This constructor is from 'HRegionInfo' class.
HRegionInfo ( TableName, byte[ ], byte[ ], boolean, long )This constructor is from 'HRegionInfo' class.
isMetaRegion ( )This method is from 'HRegionInfo' class.
isMetaTable ( )This method is from 'HRegionInfo' class.
isOffline ( )This method is from 'HRegionInfo' class.
isSplit ( )This method is from 'HRegionInfo' class.
isSplitParent ( )This method is from 'HRegionInfo' class.
parseDelimitedFrom ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseFrom ( byte[ ] )This method is from 'HRegionInfo' class.
parseFrom ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseFrom ( java.io.DataInputStream )This method is from 'HRegionInfo' class.
parseFromOrNull ( byte[ ] )This method is from 'HRegionInfo' class.
parseFromOrNull ( byte[ ], int, int )This method is from 'HRegionInfo' class.
parseRegionName ( byte[ ] )This method is from 'HRegionInfo' class.
prettyPrint ( java.lang.String )This method is from 'HRegionInfo' class.
readFields ( java.io.DataInput )This method is from 'HRegionInfo' class.
setOffline ( boolean )This method is from 'HRegionInfo' class.
setSplit ( boolean )This method is from 'HRegionInfo' class.
toByteArray ( )This method is from 'HRegionInfo' class.
toDelimitedByteArray ( )This method is from 'HRegionInfo' class.
toDelimitedByteArray ( HRegionInfo... )This method is from 'HRegionInfo' class.
toString ( )This method is from 'HRegionInfo' class.
write ( java.io.DataOutput )This method is from 'HRegionInfo' class.
getRegionInfo ( )Return value of this method has type 'HRegionInfo'.
HRegionLocation ( HRegionInfo, ServerName )1st parameter 'regionInfo' of this method has type 'HRegionInfo'.
HRegionLocation ( HRegionInfo, ServerName, long )1st parameter 'regionInfo' of this method has type 'HRegionInfo'.
[+] HTableDescriptor (2)
| Change | Effect |
|---|
| 1 | Field DEFAULT_REGION_REPLICATION has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to DEFAULT_REGION_REPLICATION is ambiguous. |
| 2 | Field REGION_REPLICATION has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to REGION_REPLICATION is ambiguous. |
[+] affected methods (3)
getHTableDescriptor ( byte[ ] )Return value of this abstract method has type 'HTableDescriptor'.
getHTableDescriptor ( TableName )Return value of this abstract method has type 'HTableDescriptor'.
getTableDescriptor ( )Return value of this method has type 'HTableDescriptor'.
package org.apache.hadoop.hbase.client
[+] ClientScanner (3)
| Change | Effect |
|---|
| 1 | Field conf has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to conf is ambiguous. |
| 2 | Field pool has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to pool is ambiguous. |
| 3 | Field primaryOperationTimeout has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to primaryOperationTimeout is ambiguous. |
[+] affected methods (10)
checkScanStopRow ( byte[ ] )This method is from 'ClientScanner' class.
close ( )This method is from 'ClientScanner' class.
getScan ( )This method is from 'ClientScanner' class.
getTable ( )This method is from 'ClientScanner' class.
getTableName ( )This method is from 'ClientScanner' class.
getTimestamp ( )This method is from 'ClientScanner' class.
initializeScannerInConstruction ( )This method is from 'ClientScanner' class.
next ( )This method is from 'ClientScanner' class.
nextScanner ( int, boolean )This method is from 'ClientScanner' class.
writeScanMetrics ( )This method is from 'ClientScanner' class.
[+] HTable (1)
| Change | Effect |
|---|
| 1 | Field multiAp has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to multiAp is ambiguous. |
[+] affected methods (86)
append ( Append )This method is from 'HTable' class.
batch ( java.util.List<? extends Row> )This method is from 'HTable' class.
batch ( java.util.List<? extends Row>, java.lang.Object[ ] )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCallback ( java.util.List<? extends Row>, coprocessor.Batch.Callback<R> )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R )This method is from 'HTable' class.
batchCoprocessorService ( com.google.protobuf.Descriptors.MethodDescriptor, com.google.protobuf.Message, byte[ ], byte[ ], R, coprocessor.Batch.Callback<R> ) This method is from 'HTable' class.
checkAndDelete ( byte[ ], byte[ ], byte[ ], byte[ ], Delete ) This method is from 'HTable' class.
checkAndMutate ( byte[ ], byte[ ], byte[ ], org.apache.hadoop.hbase.filter.CompareFilter.CompareOp, byte[ ], RowMutations ) This method is from 'HTable' class.
checkAndPut ( byte[ ], byte[ ], byte[ ], byte[ ], Put ) This method is from 'HTable' class.
clearRegionCache ( ) This method is from 'HTable' class.
close ( ) This method is from 'HTable' class.
coprocessorService ( byte[ ] ) This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R> ) This method is from 'HTable' class.
coprocessorService ( java.lang.Class<T>, byte[ ], byte[ ], coprocessor.Batch.Call<T,R>, coprocessor.Batch.Callback<R> ) This method is from 'HTable' class.
delete ( java.util.List<Delete> ) This method is from 'HTable' class.
delete ( Delete ) This method is from 'HTable' class.
exists ( java.util.List<Get> ) This method is from 'HTable' class.
exists ( Get ) This method is from 'HTable' class.
flushCommits ( ) This method is from 'HTable' class.
get ( java.util.List<Get> ) This method is from 'HTable' class.
get ( Get ) This method is from 'HTable' class.
getConfiguration ( ) This method is from 'HTable' class.
getConnection ( ) This method is from 'HTable' class.
getDefaultExecutor ( org.apache.hadoop.conf.Configuration ) This method is from 'HTable' class.
getEndKeys ( ) This method is from 'HTable' class.
getMaxKeyValueSize ( org.apache.hadoop.conf.Configuration ) This method is from 'HTable' class.
getName ( ) This method is from 'HTable' class.
getOperationTimeout ( ) This method is from 'HTable' class.
getRegionCachePrefetch ( byte[ ] ) This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ] ) This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName ) This method is from 'HTable' class.
getRegionCachePrefetch ( org.apache.hadoop.hbase.TableName ) This method is from 'HTable' class.
getRegionLocation ( byte[ ] ) This method is from 'HTable' class.
getRegionLocation ( byte[ ], boolean ) This method is from 'HTable' class.
getRegionLocation ( java.lang.String ) This method is from 'HTable' class.
getRegionLocations ( ) This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ] ) This method is from 'HTable' class.
getRegionsInRange ( byte[ ], byte[ ], boolean ) This method is from 'HTable' class.
getRowOrBefore ( byte[ ], byte[ ] ) This method is from 'HTable' class.
getScanner ( byte[ ] ) This method is from 'HTable' class.
getScanner ( byte[ ], byte[ ] ) This method is from 'HTable' class.
getScanner ( Scan ) This method is from 'HTable' class.
getScannerCaching ( ) This method is from 'HTable' class.
getStartEndKeys ( ) This method is from 'HTable' class.
getStartKeys ( ) This method is from 'HTable' class.
getTableDescriptor ( ) This method is from 'HTable' class.
getTableName ( ) This method is from 'HTable' class.
getWriteBuffer ( ) This method is from 'HTable' class.
getWriteBufferSize ( ) This method is from 'HTable' class.
HTable ( ) This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ] ) This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, byte[ ], java.util.concurrent.ExecutorService ) This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, java.lang.String ) This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName ) This constructor is from 'HTable' class.
HTable ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, java.util.concurrent.ExecutorService ) This constructor is from 'HTable' class.
increment ( Increment ) This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long ) This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, boolean ) This method is from 'HTable' class.
incrementColumnValue ( byte[ ], byte[ ], byte[ ], long, Durability ) This method is from 'HTable' class.
isAutoFlush ( ) This method is from 'HTable' class.
isTableEnabled ( byte[ ] ) This method is from 'HTable' class.
isTableEnabled ( java.lang.String ) This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, byte[ ] ) This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, java.lang.String ) This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName ) This method is from 'HTable' class.
isTableEnabled ( org.apache.hadoop.hbase.TableName ) This method is from 'HTable' class.
mutateRow ( RowMutations ) This method is from 'HTable' class.
processBatch ( java.util.List<? extends Row>, java.lang.Object[ ] ) This method is from 'HTable' class.
processBatchCallback ( java.util.List<? extends Row>, java.lang.Object[ ], coprocessor.Batch.Callback<R> ) This method is from 'HTable' class.
put ( java.util.List<Put> ) This method is from 'HTable' class.
put ( Put ) This method is from 'HTable' class.
setAutoFlush ( boolean ) This method is from 'HTable' class.
setAutoFlush ( boolean, boolean ) This method is from 'HTable' class.
setAutoFlushTo ( boolean ) This method is from 'HTable' class.
setOperationTimeout ( int ) This method is from 'HTable' class.
setRegionCachePrefetch ( byte[ ], boolean ) This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, byte[ ], boolean ) This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.conf.Configuration, org.apache.hadoop.hbase.TableName, boolean ) This method is from 'HTable' class.
setRegionCachePrefetch ( org.apache.hadoop.hbase.TableName, boolean ) This method is from 'HTable' class.
setScannerCaching ( int ) This method is from 'HTable' class.
setWriteBufferSize ( long ) This method is from 'HTable' class.
toString ( ) This method is from 'HTable' class.
validatePut ( Put ) This method is from 'HTable' class.
validatePut ( Put, int ) This method is from 'HTable' class.
configureIncrementalLoad ( org.apache.hadoop.mapreduce.Job, HTable ) 2nd parameter 'table' of this method has type 'HTable'.
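For orientation, a minimal client sketch that touches several of the members listed above (the HTable(Configuration, TableName) constructor, put, flushCommits, get and close). The table, column family and qualifier names are hypothetical placeholders, not taken from the report:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HTableUsageSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // 'demo', 'cf' and 'q' are hypothetical names used only for illustration.
        HTable table = new HTable(conf, TableName.valueOf("demo"));
        try {
            Put put = new Put(Bytes.toBytes("row1"));
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));
            table.put(put);                      // put ( Put )
            table.flushCommits();                // flushCommits ( )
            Result r = table.get(new Get(Bytes.toBytes("row1"))); // get ( Get )
            System.out.println(Bytes.toString(
                r.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"))));
        } finally {
            table.close();                       // close ( )
        }
    }
}
```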
[+] Query (2)
| # | Change | Effect |
|---|---|---|
| 1 | Field consistency has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to consistency is ambiguous. |
| 2 | Field targetReplicaId has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to targetReplicaId is ambiguous. |
[+] affected methods (1)
setFilter ( org.apache.hadoop.hbase.filter.Filter ) Return value of this method has type 'Query'.
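The "No effect" verdict above concerns binary compatibility only; the NOTE describes a source-level corner case. A hedged sketch of that pattern, assuming the new consistency field is visible to Query subclasses such as Get; the interface and its constant are hypothetical client-side code:

```java
import org.apache.hadoop.hbase.client.Get;

// Hypothetical client-side interface that happens to declare a constant
// with the same name as the field newly added to Query in branch-1.0.
interface ClientDefaults {
    String consistency = "STRONG";
}

// A client class that extends an HBase Query subclass (Get) and also
// implements the interface above. Existing binaries keep running, but
// recompiling this class against branch-1.0 can fail with
// "reference to consistency is ambiguous", because the simple name is
// now inherited from two places.
class MyGet extends Get implements ClientDefaults {
    MyGet(byte[] row) {
        super(row);
    }

    String describe() {
        return "default consistency: " + consistency; // ambiguous after the upgrade
    }
}
```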
hbase-common-0.98.9.jar
package org.apache.hadoop.hbase
[+] HBaseInterfaceAudience (1)
| # | Change | Effect |
|---|---|---|
| 1 | Field TOOLS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to TOOLS is ambiguous. |
[+] affected methods (1)
HBaseInterfaceAudience ( ) This constructor is from 'HBaseInterfaceAudience' class.
hbase-protocol-0.98.9.jar
package org.apache.hadoop.hbase.protobuf.generated
[+] HBaseProtos.RegionInfo (1)
| # | Change | Effect |
|---|---|---|
| 1 | Field REPLICA_ID_FIELD_NUMBER has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to REPLICA_ID_FIELD_NUMBER is ambiguous. |
[+] affected methods (2)
convert ( org.apache.hadoop.hbase.HRegionInfo ) Return value of this method has type 'HBaseProtos.RegionInfo'.
convert ( HBaseProtos.RegionInfo ) 1st parameter 'proto' of this method has type 'HBaseProtos.RegionInfo'.
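For context, a short sketch of round-tripping a region descriptor through the protobuf form, assuming the two convert overloads listed above are the static helpers on HRegionInfo; the table name is a hypothetical placeholder:

```java
import org.apache.hadoop.hbase.HRegionInfo;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.protobuf.generated.HBaseProtos;

public class RegionInfoRoundTrip {
    public static void main(String[] args) {
        // 'demo' is a hypothetical table name used only for illustration.
        HRegionInfo info = new HRegionInfo(TableName.valueOf("demo"));

        // convert ( org.apache.hadoop.hbase.HRegionInfo ) -> protobuf form
        HBaseProtos.RegionInfo proto = HRegionInfo.convert(info);

        // convert ( HBaseProtos.RegionInfo ) -> back to HRegionInfo
        HRegionInfo back = HRegionInfo.convert(proto);

        System.out.println(back.getRegionNameAsString());
    }
}
```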
hbase-server-0.98.9.jar
package org.apache.hadoop.hbase.mapreduce
[+] ImportTsv (1)
| # | Change | Effect |
|---|---|---|
| 1 | Field NO_STRICT_COL_FAMILY has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to NO_STRICT_COL_FAMILY is ambiguous. |
[+] affected methods (4)
createSubmittableJob ( org.apache.hadoop.conf.Configuration, java.lang.String[ ] ) This method is from 'ImportTsv' class.
ImportTsv ( ) This constructor is from 'ImportTsv' class.
main ( java.lang.String[ ] ) This method is from 'ImportTsv' class.
run ( java.lang.String[ ] ) This method is from 'ImportTsv' class.
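Since the listed entry points include run(String[]) and main(String[]), ImportTsv can be driven as a Hadoop Tool; a hedged sketch, with the column mapping, table name and input path as purely illustrative values:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.ImportTsv;
import org.apache.hadoop.util.ToolRunner;

public class ImportTsvLauncher {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // The column mapping, table name and input directory below are
        // hypothetical values used only for illustration.
        int exitCode = ToolRunner.run(conf, new ImportTsv(), new String[] {
            "-Dimporttsv.columns=HBASE_ROW_KEY,cf:value",
            "demo_table",
            "/tmp/input"
        });
        System.exit(exitCode);
    }
}
```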
[+] TableInputFormat (1)
| # | Change | Effect |
|---|---|---|
| 1 | Field SHUFFLE_MAPS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to SHUFFLE_MAPS is ambiguous. |
[+] affected methods (6)
addColumns ( org.apache.hadoop.hbase.client.Scan, byte[ ][ ] ) This method is from 'TableInputFormat' class.
configureSplitTable ( org.apache.hadoop.mapreduce.Job, org.apache.hadoop.hbase.TableName ) This method is from 'TableInputFormat' class.
getConf ( ) This method is from 'TableInputFormat' class.
getStartEndKeys ( ) This method is from 'TableInputFormat' class.
setConf ( org.apache.hadoop.conf.Configuration ) This method is from 'TableInputFormat' class.
TableInputFormat ( ) This constructor is from 'TableInputFormat' class.
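setConf(Configuration) is where TableInputFormat picks up its scan settings, so a job typically configures it through the Configuration before submission; a minimal sketch, with the table and job names as hypothetical placeholders:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.mapreduce.Job;

public class TableInputFormatSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // TableInputFormat.INPUT_TABLE names the table to scan; 'demo_table'
        // is a hypothetical table name used only for illustration.
        conf.set(TableInputFormat.INPUT_TABLE, "demo_table");

        Job job = Job.getInstance(conf, "scan-demo");
        job.setInputFormatClass(TableInputFormat.class);
        // Mapper, reducer and output configuration would follow here.
    }
}
```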
package org.apache.hadoop.hbase.regionserver
[+] CompactSplitThread (8)
| # | Change | Effect |
|---|---|---|
| 1 | Field LARGE_COMPACTION_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to LARGE_COMPACTION_THREADS is ambiguous. |
| 2 | Field LARGE_COMPACTION_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to LARGE_COMPACTION_THREADS_DEFAULT is ambiguous. |
| 3 | Field MERGE_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to MERGE_THREADS is ambiguous. |
| 4 | Field MERGE_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to MERGE_THREADS_DEFAULT is ambiguous. |
| 5 | Field SMALL_COMPACTION_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to SMALL_COMPACTION_THREADS is ambiguous. |
| 6 | Field SMALL_COMPACTION_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to SMALL_COMPACTION_THREADS_DEFAULT is ambiguous. |
| 7 | Field SPLIT_THREADS has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to SPLIT_THREADS is ambiguous. |
| 8 | Field SPLIT_THREADS_DEFAULT has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to SPLIT_THREADS_DEFAULT is ambiguous. |
[+] affected methods (1)
getRegionServer ( int ) Field 'retval.compactSplitThread' in return value of this method has type 'CompactSplitThread'.
[+] HRegionServer (7)
| # | Change | Effect |
|---|---|---|
| 1 | Field clusterConnection has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to clusterConnection is ambiguous. |
| 2 | Field configurationManager has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to configurationManager is ambiguous. |
| 3 | Field csm has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to csm is ambiguous. |
| 4 | Field hMemManager has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to hMemManager is ambiguous. |
| 5 | Field metaTableLocator has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to metaTableLocator is ambiguous. |
| 6 | Field rpcServices has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to rpcServices is ambiguous. |
| 7 | Field walFactory has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to walFactory is ambiguous. |
[+] affected methods (1)
getRegionServer ( int ) Return value of this method has type 'HRegionServer'.
[+] MemStoreFlusher (1)
| # | Change | Effect |
|---|---|---|
| 1 | Field globalMemStoreLimitLowMarkPercent has been added to this class. | No effect. NOTE: A static field from a super-interface of a client class may hide an added field (with the same name) inherited from the super-class of a client class. Recompilation of a client class may be terminated with the message: reference to globalMemStoreLimitLowMarkPercent is ambiguous. |
[+] affected methods (1)
getRegionServer ( int ) Field 'retval.cacheFlusher' in return value of this method has type 'MemStoreFlusher'.
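All three region-server classes above surface only through getRegionServer(int), the MiniHBaseCluster accessor used by tests; a hedged sketch, assuming a test environment built on HBaseTestingUtility:

```java
import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.regionserver.HRegionServer;

public class RegionServerPeek {
    public static void main(String[] args) throws Exception {
        // Starting a mini cluster here only makes the accessor reachable;
        // real tests would normally do this in setup/teardown methods.
        HBaseTestingUtility util = new HBaseTestingUtility();
        util.startMiniCluster(1);
        try {
            // getRegionServer ( int ) returns an HRegionServer instance.
            HRegionServer rs = util.getMiniHBaseCluster().getRegionServer(0);
            System.out.println("server name: " + rs.getServerName());
        } finally {
            util.shutdownMiniCluster();
        }
    }
}
```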
Java ARchives (15)
hbase-annotations-0.98.9.jar
hbase-checkstyle-0.98.9.jar
hbase-client-0.98.9.jar
hbase-common-0.98.9.jar
hbase-examples-0.98.9.jar
hbase-hadoop-compat-0.98.9.jar
hbase-hadoop2-compat-0.98.9.jar
hbase-it-0.98.9.jar
hbase-prefix-tree-0.98.9.jar
hbase-protocol-0.98.9.jar
hbase-rest-0.98.9.jar
hbase-server-0.98.9.jar
hbase-shell-0.98.9.jar
hbase-testing-util-0.98.9.jar
hbase-thrift-0.98.9.jar