SPARK-3162: Train DecisionTree locally when possible


    Details

    • Type: Improvement
    • Status: In Progress
    • Priority: Critical
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: ML
    • Labels: None

      Description

      Improvement: communication

      Currently, every level of a DecisionTree is trained in a distributed manner. At deeper levels of the tree, however, only a small subset of the training data may reach any given node. If a node's training data fits in memory on a single machine, it may be more efficient to shuffle that data to one machine and train the rest of the subtree rooted at that node locally.
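
      As a rough illustration of the intended check, the sketch below (Scala, against the RDD API) estimates whether the points reaching a node are small enough to collect onto a single machine. The names pointReachesNode, bytesPerPoint, maxLocalBytes, and trainSubtreeLocally are hypothetical and used only for this sketch; they are not existing MLlib API.

      import org.apache.spark.mllib.regression.LabeledPoint
      import org.apache.spark.rdd.RDD

      // Decide whether the points that reach a given node are small enough
      // to pull onto one machine. All names here are illustrative only.
      def canTrainLocally(
          input: RDD[LabeledPoint],
          pointReachesNode: LabeledPoint => Boolean,
          bytesPerPoint: Long,
          maxLocalBytes: Long): Boolean = {
        val numPoints = input.filter(pointReachesNode).count()
        numPoints * bytesPerPoint <= maxLocalBytes
      }

      // If the check passes, the node's points could be shuffled or collected
      // to one machine and the whole subtree rooted at that node grown there:
      //   val localPoints = input.filter(pointReachesNode).collect()
      //   val subtree = trainSubtreeLocally(localPoints)  // hypothetical local learner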

      Note: Local training may become possible at different levels in different branches of the tree. There are multiple options for handling this case:
      (1) Train in a distributed fashion until all remaining nodes can be trained locally. This would entail training multiple levels at once (locally).
      (2) Train branches locally as soon as possible, and interleave this with distributed training of the remaining branches (see the sketch after this list).
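
      One way option (2) could be structured, sketched below with purely hypothetical placeholder types and functions (Node, fitsLocally, trainSubtreeLocally, trainOneDistributedLevel), is to keep a frontier of unfinished nodes and, on each pass, hand the nodes whose data fits in memory to a local learner while advancing the rest by one distributed level.

      // Sketch of option (2): interleave local subtree training with
      // distributed level-by-level training. All symbols are placeholders.
      case class Node(id: Int)

      def growTree(
          initialFrontier: Seq[Node],
          fitsLocally: Node => Boolean,
          trainSubtreeLocally: Node => Unit,
          trainOneDistributedLevel: Seq[Node] => Seq[Node]): Unit = {
        var frontier = initialFrontier
        while (frontier.nonEmpty) {
          // Nodes whose data fits on one machine finish their subtrees locally.
          val (local, distributed) = frontier.partition(fitsLocally)
          local.foreach(trainSubtreeLocally)
          // The remaining nodes advance one level; their children form the next frontier.
          frontier = trainOneDistributedLevel(distributed)
        }
      }

      Under this framing, option (1) corresponds to taking the local branch only once every node in the frontier fits on a single machine.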


    People

    • Assignee: Unassigned
    • Reporter: josephkb (Joseph K. Bradley)
    • Votes: 0
    • Watchers: 13
