Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: None
    • Labels: None

      Description

      It would be nice to be able to rename tables, if this is possible. Some of our internal users are doing things like: upload table mytable -> realize they screwed up -> upload table mytable_2 -> decide mytable_2 looks better -> have to go on using mytable_2 instead of the originally desired table name.
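
      For readers landing on this issue later: HBase still has no atomic rename, but in versions with snapshot support the commonly documented workaround is a snapshot/clone cycle in the shell. The table names below are just the reporter's examples:

```
hbase> disable 'mytable_2'
hbase> snapshot 'mytable_2', 'mytable_snapshot'
hbase> clone_snapshot 'mytable_snapshot', 'mytable'
hbase> delete_snapshot 'mytable_snapshot'
hbase> drop 'mytable_2'
```

      Note this copies metadata and relinks files rather than renaming in place, so verify the clone before dropping the original table.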

        Attachments

        • copy_table.rb (5 kB) — stack
        • rename_table.rb (6 kB) — stack

        Issue Links

          Activity

          Harsh J added a comment -

          Here's a link to Shrijeet Paliwal's Java code snippet that seems to work (it may need an extra disable/enable and an hbck -fix afterwards, though): https://gist.github.com/3913529. This was posted on the user lists.

          stack added a comment -

          @Devin No. What is attached here has rotted; it would need work to update it. Even then, the script was a band-aid. What is really needed is for us to keep an id rather than the table name everywhere, so that a rename only changes the table name once, in a table-to-id map, rather than in a few places in the filesystem as well as in the metadata kept by each table region.

          Devin Bayer added a comment -

          Hi. Is it safe to use the rename_table.rb script attached to this bug? Which versions does it work with?

          stack added a comment -

          @Sameer I don't think so. The rename script was recently removed because it had rotted; it had not been updated to match the changed API. We really need something like ids for tables, along the lines of what Keith describes above, so that a rename is a near-costless operation (as opposed to the rewrite of .META. and the HDFS dir name that the rename script used to do).

          Sameer Vaishampayan added a comment -

          @stack Seems this bug can be closed?

          Keith Turner added a comment -

          Accumulo supports this feature by using table ids. Table ids are generated using ZooKeeper and are never reused (base-36 numbers are used to keep them short and readable). A mapping from table id to table name is stored in ZooKeeper. To rename a table, lock the table and change the mapping in ZooKeeper.

          Accumulo did not always use table ids; it used to store the table name in its metadata table and in HDFS. Now it uses the table id in both. When we were discussing renaming tables, it seemed very complicated; then someone thought of the table id solution, which was elegant and made the problem trivial.

          Although table ids were implemented to support table renaming, they had the nice side effect of making HDFS and metadata entries much shorter.
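
          The indirection described above can be sketched roughly as follows. This is illustrative pseudocode, not Accumulo or HBase source; all names here (TableCatalog, to_base36, data_dir) are invented, and the real systems do this under a ZooKeeper table lock rather than in-process:

```python
import itertools

def to_base36(n: int) -> str:
    """Render an integer as a short base-36 id (0-9, a-z)."""
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    if n == 0:
        return "0"
    out = []
    while n:
        n, r = divmod(n, 36)
        out.append(digits[r])
    return "".join(reversed(out))

class TableCatalog:
    """Maps table names to immutable, never-reused ids.

    Storage paths and metadata are keyed by id, so renaming a table
    only rewrites one entry in this map."""

    def __init__(self):
        self._counter = itertools.count(1)  # ids are never reused
        self._name_to_id = {}

    def create(self, name: str) -> str:
        if name in self._name_to_id:
            raise ValueError(f"table {name!r} already exists")
        tid = to_base36(next(self._counter))
        self._name_to_id[name] = tid
        return tid

    def rename(self, old: str, new: str) -> None:
        # A single map update; nothing in the filesystem or the
        # per-region metadata changes, which is the whole point.
        if new in self._name_to_id:
            raise ValueError(f"table {new!r} already exists")
        self._name_to_id[new] = self._name_to_id.pop(old)

    def data_dir(self, name: str) -> str:
        # The storage layout uses the id, never the name.
        return f"/hbase/{self._name_to_id[name]}"
```

          Because the data directory is derived from the id, a rename leaves every file path valid, which is exactly why the attached scripts (which rewrite .META. rows and HDFS directories) were so fragile by comparison.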

          stack added a comment -

          bin/rename_table.rb has been updated to match the new API in 0.20 and trunk.

          Erik Holstad added a comment -

          Yeah, I will deal with this. I have been working on it for the last couple of days, but have had some problems with versions and such; maybe you can give me a hand later today or so?

          stack added a comment -

          You interested in this one Erik?

          Yan Liu added a comment -

          I am running hbase-0.19.0 on EC2. When I tried to use that "rename_table.rb", I hit the following problem:

          bin/hbase org.jruby.Main /mnt/rename_table.rb 1001_profiles 1001_profiles_backup

          09/02/18 13:19:27 INFO regionserver.HLog: New log writer: /user/root/log_1234981167000/hlog.dat.1234981167004
          09/02/18 13:19:27 INFO util.NativeCodeLoader: Loaded the native-hadoop library
          09/02/18 13:19:27 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
          09/02/18 13:19:27 INFO compress.CodecPool: Got brand-new decompressor
          09/02/18 13:19:27 INFO compress.CodecPool: Got brand-new decompressor
          09/02/18 13:19:27 INFO compress.CodecPool: Got brand-new decompressor
          09/02/18 13:19:27 INFO compress.CodecPool: Got brand-new decompressor
          09/02/18 13:19:27 INFO regionserver.HRegion: region ROOT,,0/70236052 available
          09/02/18 13:19:27 INFO regionserver.HRegion: starting compaction on region ROOT,,0
          09/02/18 13:19:27 INFO compress.CodecPool: Got brand-new compressor
          09/02/18 13:19:27 INFO regionserver.HRegion: compaction completed on region ROOT,,0 in 0sec
          09/02/18 13:19:27 INFO rename_table: Scanning .META.,,1
          09/02/18 13:19:27 INFO regionserver.HRegion: region .META.,,1/1028785192 available
          09/02/18 13:19:27 INFO regionserver.HRegion: starting compaction on region .META.,,1
          09/02/18 13:19:28 INFO regionserver.HRegion: compaction completed on region .META.,,1 in 0sec
          09/02/18 13:19:28 INFO rename_table: Renaming hdfs://domU-12-31-39-03-BD-A7.compute-1.internal:50001/hbase/1001_profiles/1153297718 as hdfs://domU-12-31-39-03-BD-A7.compute-1.internal:50001/hbase/1001_profiles_backup/1047320069
          09/02/18 13:19:28 INFO rename_table: Removing 1001_profiles,,1234593264387 from .META.
          09/02/18 13:19:28 INFO regionserver.HRegion: Closed ROOT,,0
          09/02/18 13:19:28 INFO regionserver.HRegion: Closed .META.,,1
          09/02/18 13:19:28 INFO regionserver.HLog: Closed hdfs://domU-12-31-39-03-BD-A7.compute-1.internal:50001/user/root/log_1234981167000/hlog.dat.0, entries=0. New log writer: /user/root/log_1234981167000/hlog.dat.1234981168417
          09/02/18 13:19:28 INFO regionserver.HLog: removing old log file /user/root/log_1234981167000/hlog.dat.0 whose highest sequence/edit id is 75001755
          file:/usr/local/hbase-0.19.0/lib/jruby-complete-1.1.2.jar!/builtin/java/collections.rb:29: no deleteAll with arguments matching [class [B, class java.lang.Long] on object #<Java::OrgApacheHadoopHbaseRegionserver::HRegion:0xa8a314 @java_object=.META.,,1> (NameError)
          from file:/usr/local/hbase-0.19.0/lib/jruby-complete-1.1.2.jar!/builtin/java/collections.rb:29:in `call'
          from file:/usr/local/hbase-0.19.0/lib/jruby-complete-1.1.2.jar!/builtin/java/collections.rb:29:in `each'
          from /mnt/rename_table.rb:100

          After that, I can't even run the "list" command in the HBase shell. When I issued "list", I saw the following dump:

          hbase(main):001:0> list
          NativeException: org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server 10.249.190.85:60020 for region .META.,,1, row '', but failed after 5 attempts.
          Exceptions:
          java.io.IOException: java.io.IOException: HStoreScanner failed construction
          at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:70)
          at org.apache.hadoop.hbase.regionserver.HStoreScanner.<init>(HStoreScanner.java:88)
          at org.apache.hadoop.hbase.regionserver.HStore.getScanner(HStore.java:2125)
          at org.apache.hadoop.hbase.regionserver.HRegion$HScanner.<init>(HRegion.java:1989)
          at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1180)
          at org.apache.hadoop.hbase.regionserver.HRegionServer.openScanner(HRegionServer.java:1700)
          at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
          at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
          at java.lang.reflect.Method.invoke(Method.java:597)
          at org.apache.hadoop.hbase.ipc.HBaseRPC$Server.call(HBaseRPC.java:632)
          at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:895)
          Caused by: java.io.FileNotFoundException: File does not exist: hdfs://domU-12-31-39-03-BD-A7.compute-1.internal:50001/hbase/.META./1028785192/info/mapfiles/1397620458287085628/data
          at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:394)
          at org.apache.hadoop.fs.FileSystem.getLength(FileSystem.java:679)
          at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1431)
          at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1426)
          at org.apache.hadoop.hbase.io.MapFile$Reader.createDataFileReader(MapFile.java:310)
          at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.createDataFileReader(HBaseMapFile.java:96)
          at org.apache.hadoop.hbase.io.MapFile$Reader.open(MapFile.java:292)
          at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.<init>(HBaseMapFile.java:79)
          at org.apache.hadoop.hbase.io.BloomFilterMapFile$Reader.<init>(BloomFilterMapFile.java:65)
          at org.apache.hadoop.hbase.regionserver.HStoreFile.getReader(HStoreFile.java:443)
          at org.apache.hadoop.hbase.regionserver.StoreFileScanner.openReaders(StoreFileScanner.java:96)
          at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:67)
          ... 10 more

          [the identical HStoreScanner / FileNotFoundException stack was pasted again for each of the four remaining retry attempts]

          from org/apache/hadoop/hbase/client/HConnectionManager.java:841:in `getRegionServerWithRetries'
          from org/apache/hadoop/hbase/client/MetaScanner.java:56:in `metaScan'
          from org/apache/hadoop/hbase/client/MetaScanner.java:30:in `metaScan'
          from org/apache/hadoop/hbase/client/HConnectionManager.java:311:in `listTables'
          from org/apache/hadoop/hbase/client/HBaseAdmin.java:122:in `listTables'
          from sun/reflect/NativeMethodAccessorImpl.java:-2:in `invoke0'
          from sun/reflect/NativeMethodAccessorImpl.java:39:in `invoke'
          from sun/reflect/DelegatingMethodAccessorImpl.java:25:in `invoke'
          from java/lang/reflect/Method.java:597:in `invoke'
          from org/jruby/javasupport/JavaMethod.java:250:in `invokeWithExceptionHandling'
          from org/jruby/javasupport/JavaMethod.java:219:in `invoke'
          from org/jruby/javasupport/JavaClass.java:416:in `execute'
          from org/jruby/internal/runtime/methods/SimpleCallbackMethod.java:67:in `call'
          from org/jruby/internal/runtime/methods/DynamicMethod.java:70:in `call'
          from org/jruby/runtime/CallSite.java:123:in `cacheAndCall'
          from org/jruby/runtime/CallSite.java:298:in `call'
          ... 130 levels...
          from ruby.usr.local.hbase_minus_0_dot_19_dot_0.bin.hirbInvokermethod__32$RUBY$startOpt:-1:in `call'
          from org/jruby/internal/runtime/methods/DynamicMethod.java:74:in `call'
          from org/jruby/internal/runtime/methods/CompiledMethod.java:48:in `call'
          from org/jruby/runtime/CallSite.java:123:in `cacheAndCall'
          from org/jruby/runtime/CallSite.java:298:in `call'
          from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:429:in `_file_'
          from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:-1:in `_file_'
          from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:-1:in `load'
          from org/jruby/Ruby.java:512:in `runScript'
          from org/jruby/Ruby.java:432:in `runNormally'
          from org/jruby/Ruby.java:312:in `runFromMain'
          from org/jruby/Main.java:144:in `run'
          from org/jruby/Main.java:89:in `run'
          from org/jruby/Main.java:80:in `main'
          from /usr/local/hbase-0.19.0/bin/../bin/hirb.rb:288:in `list'

org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1431) at org.apache.hadoop.hbase.io.SequenceFile$Reader.<init>(SequenceFile.java:1426) at org.apache.hadoop.hbase.io.MapFile$Reader.createDataFileReader(MapFile.java:310) at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.createDataFileReader(HBaseMapFile.java:96) at org.apache.hadoop.hbase.io.MapFile$Reader.open(MapFile.java:292) at org.apache.hadoop.hbase.io.HBaseMapFile$HBaseReader.<init>(HBaseMapFile.java:79) at org.apache.hadoop.hbase.io.BloomFilterMapFile$Reader.<init>(BloomFilterMapFile.java:65) at org.apache.hadoop.hbase.regionserver.HStoreFile.getReader(HStoreFile.java:443) at org.apache.hadoop.hbase.regionserver.StoreFileScanner.openReaders(StoreFileScanner.java:96) at org.apache.hadoop.hbase.regionserver.StoreFileScanner.<init>(StoreFileScanner.java:67) ... 10 more from org/apache/hadoop/hbase/client/HConnectionManager.java:841:in `getRegionServerWithRetries' from org/apache/hadoop/hbase/client/MetaScanner.java:56:in `metaScan' from org/apache/hadoop/hbase/client/MetaScanner.java:30:in `metaScan' from org/apache/hadoop/hbase/client/HConnectionManager.java:311:in `listTables' from org/apache/hadoop/hbase/client/HBaseAdmin.java:122:in `listTables' from sun/reflect/NativeMethodAccessorImpl.java:-2:in `invoke0' from sun/reflect/NativeMethodAccessorImpl.java:39:in `invoke' from sun/reflect/DelegatingMethodAccessorImpl.java:25:in `invoke' from java/lang/reflect/Method.java:597:in `invoke' from org/jruby/javasupport/JavaMethod.java:250:in `invokeWithExceptionHandling' from org/jruby/javasupport/JavaMethod.java:219:in `invoke' from org/jruby/javasupport/JavaClass.java:416:in `execute' from org/jruby/internal/runtime/methods/SimpleCallbackMethod.java:67:in `call' from org/jruby/internal/runtime/methods/DynamicMethod.java:70:in `call' from org/jruby/runtime/CallSite.java:123:in `cacheAndCall' from org/jruby/runtime/CallSite.java:298:in `call' ... 130 levels... 
from ruby.usr.local.hbase_minus_0_dot_19_dot_0.bin.hirbInvokermethod__32$RUBY$startOpt:-1:in `call' from org/jruby/internal/runtime/methods/DynamicMethod.java:74:in `call' from org/jruby/internal/runtime/methods/CompiledMethod.java:48:in `call' from org/jruby/runtime/CallSite.java:123:in `cacheAndCall' from org/jruby/runtime/CallSite.java:298:in `call' from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:429:in `_ file _' from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:-1:in `_ file _' from ruby/usr/local/hbase_minus_0_dot_19_dot_0/bin//usr/local/hbase-0.19.0/bin/../bin/hirb.rb:-1:in `load' from org/jruby/Ruby.java:512:in `runScript' from org/jruby/Ruby.java:432:in `runNormally' from org/jruby/Ruby.java:312:in `runFromMain' from org/jruby/Main.java:144:in `run' from org/jruby/Main.java:89:in `run' from org/jruby/Main.java:80:in `main' from /usr/local/hbase-0.19.0/bin/../bin/hirb.rb:288:in `list'
          stack added a comment -

          I committed the above two scripts, though with 10 lines of duplication. I went about adding a new lib/ruby dir that both scripts require, but it is a pain making sure we don't overlap imports (we get messy warnings if anything is duplicated), so I need to do more study first.
          stack added a comment -

          Here's a script to copy a table in hbase. I've been using it here at pset to make copies of tables to get around deletes made before HBASE-826 went in.

          If we were to commit these scripts, we need to refactor both, since they have a bunch in common; we should start a ruby lib subdir under the hbase lib dir into which we put common script utilities.
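          The attached copy_table.rb is the actual script; the sketch below is only a toy illustration of the scan-the-source, write-the-destination shape such a copy takes. The helper name and the hash-based table model here are invented for illustration and are not the script's API.

```ruby
# Conceptual sketch only: a "table" is modeled as a hash of
# row-key => { column => value }. The real copy_table.rb works against
# HBase's region files; this just shows the copy's overall shape.
def copy_table(tables, src_name, dst_name)
  raise "no such table: #{src_name}" unless tables.key?(src_name)
  raise "table already exists: #{dst_name}" if tables.key?(dst_name)

  dst = {}
  # "Scan" every row of the source and "put" it into the destination,
  # duplicating the column map so the copies are independent.
  tables[src_name].each do |row_key, columns|
    dst[row_key] = columns.dup
  end
  tables[dst_name] = dst
  tables
end
```

          The point of the sketch is that a copy never has to touch the source table's name anywhere; it only reads rows and writes them elsewhere, which is why copying was workable long before a true rename was.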
          stack added a comment -

          Deleted old broken version.

          stack added a comment -

          The previous version put all tables into one. This new version does the right thing.

          stack added a comment -

          Tested against a table of 40 regions. The table had millions of rows and was new, and so 'clean'. The script worked without issue.
          stack added a comment -

          A script to rename a table; it needs some more testing.
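          The attached rename_table.rb is the actual script; the sketch below is only a toy illustration of why a rename has to touch more than one place: the table name is embedded both in the region names recorded in .META. and in the table's directory path under the hbase root, so both have to be rewritten consistently. The data structures and helper here are invented for illustration and are not the script's API.

```ruby
# Conceptual sketch only. "meta" models .META. as a hash of
# region-name => region info, where region names look like
# "<table>,<start-key>,<id>". "fs" models the filesystem as a hash of
# path => contents, with table data living under /hbase/<table>/.
def rename_table(meta, fs, old_name, new_name)
  # 1. Rewrite the table-name prefix of every matching region name.
  renamed_meta = {}
  meta.each do |region_name, info|
    table, rest = region_name.split(",", 2)
    key = table == old_name ? "#{new_name},#{rest}" : region_name
    renamed_meta[key] = info
  end

  # 2. "Move" the table directory, e.g. /hbase/old -> /hbase/new.
  renamed_fs = {}
  fs.each do |path, files|
    new_path = path.sub(%r{\A/hbase/#{Regexp.escape(old_name)}(/|\z)},
                        "/hbase/#{new_name}\\1")
    renamed_fs[new_path] = files
  end

  [renamed_meta, renamed_fs]
end
```

          Because these two rewrites are separate steps, a crash in between leaves metadata and filesystem disagreeing, which is why a band-aid script is fragile and an id-to-name indirection (rename once, in one map) is the cleaner fix.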
          Franco Salvetti added a comment -

          As an alternative:

          • overwrite a table
          • remove a table

          That is, something which basically supports the use case of:

          load, reload, reload, ..., reload, ... I am happy ... keep it
          Bryan Duxbury added a comment -

          Agree, we should have this. +1

          Edward J. Yoon added a comment -

          +1


            People

            • Assignee:
              Unassigned
            • Reporter:
              Michael Bieniosek
            • Votes:
              2
            • Watchers:
              12
