HBASE-16873 WAL: SequenceId assign with less friction


Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Performance, wal
    • Labels: None

    Description

      This is an issue to improve our current sequence id assignment, which has become complex, with lots of friction.

      In the old days, the notion that the single consumer thread pulling from the ringbuffer should assign all sequenceids seemed to make sense. It probably had its provenance in the era when there was a single sequenceid per regionserver, but it seemed like a fine choice even after the move to region-scoped sequenceids – rather than regionserver scope – and then beyond that, when region-scoped sequenceids were unified with mvcc. The rationale ran: a single thread appending to the WAL can run without locks, and this single thread, being the arbiter of order, seemed like the natural owner of the sequenceid increment.
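
      A minimal sketch of that historical single-writer model (illustrative only; SingleConsumerSketch, WalEntry, and appendToWal are hypothetical stand-ins, not the actual FSHLog internals):

      {code:java}
      // Minimal sketch (not the actual HBase code) of the single-writer model:
      // one consumer thread drains the ring buffer and is the sole writer of
      // the sequenceid counter, so the counter itself needs no synchronization.
      final class SingleConsumerSketch {
        private long nextSequenceId = 1; // touched only by the consumer thread

        /** Runs on the lone ring buffer consumer thread. */
        void onEvent(WalEntry entry) {
          entry.sequenceId = nextSequenceId++; // race-free: single writer
          appendToWal(entry);
        }

        private void appendToWal(WalEntry entry) {
          // ... write the entry out to the log ...
        }

        /** Hypothetical stand-in for a WAL entry. */
        static final class WalEntry {
          long sequenceId;
        }
      }
      {code}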

      Along comes a large-scale production deploy, HBASE-16698. It highlights an oversight in the above reasoning; i.e. that the single RB consumer thread must pass through a synchronized block per region to do the sequence id update, and the spread between the call to append and the actual assignment of the sequence id on the other side of the RB forces a severe serialization where there is opportunity for parallelism.
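
      A sketch of the friction being described (illustrative; the names here are hypothetical, not the actual HBase internals): every handler that calls append parks until the lone consumer, holding the per-region lock, hands the sequence id back from the far side of the ring buffer.

      {code:java}
      // Illustrative only: the per-region synchronized block the lone consumer
      // must take for every append, serializing work that could run in parallel.
      final class PerRegionAssignSketch {
        private long nextSequenceId = 1;

        /** Runs on the single ring buffer consumer thread. */
        void onEvent(WalEntry entry, Object regionLock) {
          long seqId;
          synchronized (regionLock) { // per-region lock: the friction point
            seqId = nextSequenceId++;
            // ... advance the region's mvcc/sequenceid accounting here ...
          }
          entry.complete(seqId);      // unparks the handler blocked in append()
        }

        /** Hypothetical stand-in: the appender blocks in get() until complete() runs. */
        static final class WalEntry {
          private final java.util.concurrent.CompletableFuture<Long> future =
              new java.util.concurrent.CompletableFuture<>();
          void complete(long seqId) { future.complete(seqId); }
          long get() throws Exception { return future.get(); }
        }
      }
      {code}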

      This issue is about taking this finding and doing better than the expedient fix done in HBASE-16698. Can we do without the lock on the region when getting the sequenceid as we call append? Can we exploit the fact that the ringbuffer txid is always incrementing, as is the region mvcc/sequenceid? Can we use this fact to do the region sequenceid without taking a lock?
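
      One possible direction, sketched under assumptions (this is not a committed design): since both the ringbuffer txid and the region sequenceid only ever move forward, a region's highest-assigned watermark could be advanced with a compare-and-set loop on an AtomicLong instead of inside a synchronized block.

      {code:java}
      import java.util.concurrent.atomic.AtomicLong;

      // Assumption-laden sketch of a lock-free region sequenceid: monotonicity
      // lets compare-and-set replace the per-region synchronized block.
      final class LockFreeRegionSequenceId {
        private final AtomicLong highest = new AtomicLong(0);

        /** Assign the next sequenceid for this region, from any handler thread. */
        long assignNext() {
          return highest.incrementAndGet(); // lock-free and monotonic
        }

        /** Advance the watermark to at least seqId, monotonically, without locking. */
        void advanceTo(long seqId) {
          long cur = highest.get();
          while (cur < seqId && !highest.compareAndSet(cur, seqId)) {
            cur = highest.get(); // lost a race; re-read and retry if still behind
          }
        }
      }
      {code}

      If something along these lines held up, a handler could pick up its sequenceid on the producer side of the ring buffer at append time, instead of parking until the consumer thread gets around to assigning it.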


            People

              Unassigned Unassigned
              stack Michael Stack
              Votes:
              0 Vote for this issue
              Watchers:
              18 Start watching this issue
