Details

    • Type: Sub-task
    • Status: In Progress
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Balancer, master
    • Labels: None

Description

TableSkewCostFunction uses the sum, over all tables, of the maximum per-server deviation in region count as its measure of unevenness. This fails in a very common operational scenario. Say we have 100 regions on 50 nodes, two on each, and we add 50 new nodes that hold 0 regions each. The mean is now 1 region per server, so the maximum deviation from the mean is 1, compared to 99 in the worst-case scenario of all 100 regions on a single server. The normalized cost is 1/99 ≈ 0.01, below the default threshold of 0.05, so the balancer wouldn't move any regions. The proposal is to use the aggregated deviation of the region count per region server to detect this scenario: here each of the 100 servers deviates from the mean by 1, for an aggregated deviation of 100 against a worst case of 198, yielding a cost of 100/198 ≈ 0.5.
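
For illustration, a minimal sketch of the proposed aggregated-deviation cost (the class and method names are hypothetical, not the actual HBase implementation). It normalizes by the worst case of all regions on a single server, whose aggregated deviation works out to (total - mean) + (n - 1) * mean = 2 * total * (n - 1) / n.

    public final class AggregatedSkewCostSketch {
        /**
         * Cost in [0, 1]: sum of |count - mean| over all region servers,
         * normalized by the worst case of all regions on one server.
         */
        public static double cost(int[] regionsPerServer) {
            int n = regionsPerServer.length;
            long total = 0;
            for (int c : regionsPerServer) {
                total += c;
            }
            double mean = (double) total / n;
            double aggregatedDeviation = 0;
            for (int c : regionsPerServer) {
                aggregatedDeviation += Math.abs(c - mean);
            }
            // Worst case: one server holds all regions, the rest hold none:
            // (total - mean) + (n - 1) * mean = 2 * total * (n - 1) / n
            double worstCase = 2.0 * total * (n - 1) / n;
            return worstCase == 0 ? 0 : aggregatedDeviation / worstCase;
        }

        public static void main(String[] args) {
            int[] counts = new int[100];              // 100 region servers
            java.util.Arrays.fill(counts, 0, 50, 2);  // 50 old nodes, 2 regions each
            System.out.println(cost(counts));         // ~0.505, well above 0.05
        }
    }

On the scenario from the description this prints 100/198 ≈ 0.5, whereas the existing max-deviation cost yields 1/99 ≈ 0.01 and stays below the threshold.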


People

    • Assignee: Clara Xiong (clarax98007)
    • Reporter: Clara Xiong (claraxiong)
    • Votes: 0
    • Watchers: 5
