HBASE-20586: SyncTable tool: Add support for cross-realm remote clusters



    Description

      One possible scenario for HashTable/SyncTable is synchronizing different clusters, for instance when replication has been enabled but data already existed, or when replication issues have caused long lags in replication.

      For secured clusters under different Kerberos realms (with cross-realm trust properly configured), though, the current SyncTable version fails to authenticate with the remote cluster, both when trying to read the HashTable output (when sourcehashdir is remote) and when trying to read table data on the remote cluster (when sourcezkcluster is remote).
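
      For context, a cross-cluster run points both of those options at the source cluster. As a rough illustration only (the host names, ZooKeeper quorum, and paths below are made-up placeholders, not values from this issue), a programmatic invocation could look like this:

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.hbase.HBaseConfiguration;
      import org.apache.hadoop.hbase.mapreduce.SyncTable;
      import org.apache.hadoop.util.ToolRunner;

      public class CrossClusterSyncTableExample {
        public static void main(String[] args) throws Exception {
          Configuration conf = HBaseConfiguration.create();
          // HashTable output previously generated against the remote (source) cluster,
          // plus the source cluster's ZooKeeper quorum. All values are placeholders.
          String[] syncArgs = new String[] {
              "--sourcezkcluster=remote-zk1,remote-zk2,remote-zk3:2181:/hbase",
              "--dryrun=true",
              "hdfs://remote-nn:8020/hashes/my_table", // sourcehashdir on the remote HDFS
              "my_table",                              // source table
              "my_table"                               // target table
          };
          int exitCode = ToolRunner.run(conf, new SyncTable(conf), syncArgs);
          System.exit(exitCode);
        }
      }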

      The HDFS error looks like this:

      INFO mapreduce.Job: Task Id : attempt_1524358175778_105392_m_000000_0, Status : FAILED
      
      Error: java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "local-host/1.1.1.1"; destination host is: "remote-nn":8020;
              at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
              at org.apache.hadoop.ipc.Client.call(Client.java:1506)
              at org.apache.hadoop.ipc.Client.call(Client.java:1439)
              at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
              at com.sun.proxy.$Proxy13.getBlockLocations(Unknown Source)
              at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:256)
      ...
              at org.apache.hadoop.hbase.mapreduce.HashTable$TableHash.readPropertiesFile(HashTable.java:144)
              at org.apache.hadoop.hbase.mapreduce.HashTable$TableHash.read(HashTable.java:105)
              at org.apache.hadoop.hbase.mapreduce.SyncTable$SyncMapper.setup(SyncTable.java:188)
              at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
      ...
      Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]

      The above can be resolved if the SyncTable job acquires a delegation token for the remote NameNode (see the sketch at the end of this description). Once the HDFS-side authentication is done, it is also necessary to authenticate against the remote HBase cluster, otherwise the error below arises:

      INFO mapreduce.Job: Task Id : attempt_1524358175778_172414_m_000000_0, Status : FAILED
      Error: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the location
      at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:326)
      ...
      at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:867)
      at org.apache.hadoop.hbase.mapreduce.SyncTable$SyncMapper.syncRange(SyncTable.java:331)
      ...
      Caused by: java.io.IOException: Could not set up IO Streams to remote-rs-host/1.1.1.2:60020
      at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:786)
      ...
      Caused by: java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
      ...
      Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
      ...

      The above requires additional authentication logic against the remote HBase cluster.
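
      A minimal sketch of the kind of job-setup logic that would cover both failures is shown below. It assumes two things happen at job submission time: obtaining an HDFS delegation token for the remote NameNode that hosts the HashTable output, and obtaining an HBase authentication token for the cluster identified by sourcezkcluster. The helper name and values are illustrative, and this is not necessarily how the attached patch implements it.

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.fs.Path;
      import org.apache.hadoop.hbase.HBaseConfiguration;
      import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
      import org.apache.hadoop.mapreduce.Job;
      import org.apache.hadoop.mapreduce.security.TokenCache;

      public class RemoteClusterCredentialsSketch {

        /**
         * Hypothetical helper: adds the credentials a SyncTable-style job needs to
         * read from a remote, cross-realm secured cluster. Arguments are the remote
         * hash directory (sourcehashdir) and the remote cluster key (sourcezkcluster).
         */
        static void addRemoteClusterCredentials(Job job, String sourceHashDir,
            String sourceZkCluster) throws Exception {
          Configuration conf = job.getConfiguration();

          // 1) HDFS: obtain a delegation token for the remote NameNode so that
          //    map tasks can read the HashTable output under sourcehashdir.
          TokenCache.obtainTokensForNamenodes(job.getCredentials(),
              new Path[] { new Path(sourceHashDir) }, conf);

          // 2) HBase: obtain an authentication token for the remote HBase cluster
          //    identified by the source ZooKeeper quorum.
          Configuration remoteClusterConf =
              HBaseConfiguration.createClusterConf(conf, sourceZkCluster);
          TableMapReduceUtil.initCredentialsForCluster(job, remoteClusterConf);
        }
      }

      With both tokens placed in the job's Credentials, map tasks running in the local realm should be able to read the remote HashTable output and scan the remote table without needing their own Kerberos TGT for the remote realm.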

    Attachments

      1. HBASE-20586.master.001.patch (3 kB, Wellington Chevreuil)

    People

      Assignee: Wellington Chevreuil (wchevreuil)
      Reporter: Wellington Chevreuil (wchevreuil)
      Votes: 0
      Watchers: 6
