Description

TableSkewCostFunction uses the sum, over all tables, of the maximum per-server deviation in region count as its measure of unevenness. This fails in a very common operational scenario. Say we have 100 regions on 50 nodes, two on each, and we add 50 new nodes holding 0 each. The max deviation from the mean is 1, compared to 99 in the worst case of all 100 regions sitting on a single server, so the normalized cost is 1/99 = 0.011, below the default threshold of 0.05, and the balancer doesn't move anything. The proposal is to use the aggregated deviation of the per-server region count instead, which detects this scenario and yields a cost of 100/198 = 0.5 in this case.
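
The arithmetic above can be made concrete with a small sketch. The following is a hypothetical, self-contained Java example of the aggregated-deviation cost described in the proposal, not the actual StochasticLoadBalancer code; the class name SkewCostSketch and the cost method are invented for illustration.

{code:java}
public class SkewCostSketch {
  /**
   * Cost = sum of |regionCount - mean| over all servers, normalized by the
   * worst case in which one server holds every region and the rest hold none.
   */
  static double cost(int[] regionsPerServer) {
    int numServers = regionsPerServer.length;
    int totalRegions = 0;
    for (int c : regionsPerServer) {
      totalRegions += c;
    }
    double mean = (double) totalRegions / numServers;

    // Aggregated deviation: sum of |count - mean| over all servers.
    double totalDeviation = 0.0;
    for (int c : regionsPerServer) {
      totalDeviation += Math.abs(c - mean);
    }

    // Worst case: one server deviates by (total - mean), the other
    // (numServers - 1) servers each deviate by mean.
    double worstDeviation = (totalRegions - mean) + (numServers - 1) * mean;
    return worstDeviation == 0 ? 0 : totalDeviation / worstDeviation;
  }

  public static void main(String[] args) {
    // The scenario from the description: 50 old servers with 2 regions each,
    // 50 freshly added servers with 0.
    int[] counts = new int[100];
    for (int i = 0; i < 50; i++) {
      counts[i] = 2;
    }
    // Prints ~0.505 (100/198), well above the 0.05 move threshold,
    // whereas the max-deviation cost would be 1/99 = 0.011.
    System.out.println(cost(counts));
  }
}
{code}

Normalizing by the all-regions-on-one-server worst case keeps the cost in [0, 1], matching the scale the balancer's threshold expects.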
