Apache MADlib / MADLIB-1206

Add mini-batch based gradient descent support to MLP


Details

    • Type: New Feature
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: v1.14
    • Component/s: None

    Description

      Mini-batch gradient descent is typically the algorithm of choice when training a neural network.

      MADlib currently supports IGD; we may need to add extensions so that mini-batch is available as a solver for MLP. Other modules will continue to use the existing IGD, which does not support mini-batching. Later JIRAs will move the other modules over, one at a time, to the new mini-batch GD.
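      For illustration only (this is not MADlib's MLP implementation; the function names and the linear-model gradient grad_mse are made up for the sketch), the difference between the two update schemes in NumPy terms: IGD updates the coefficients after every individual row, while mini-batch GD shuffles the rows once per pass and performs one update per batch of rows.

{code:python}
import numpy as np

def grad_mse(w, X, y):
    # Mean-squared-error gradient for a linear model; a stand-in for the
    # MLP back-propagation gradient in this sketch.
    return 2.0 * X.T @ (X @ w - y) / len(y)

def igd_epoch(X, y, w, lr=0.05):
    # Incremental gradient descent (the existing solver): the coefficients
    # are updated after every individual row.
    for i in range(len(y)):
        w = w - lr * grad_mse(w, X[i:i + 1], y[i:i + 1])
    return w

def minibatch_epoch(X, y, w, lr=0.05, batch_size=32):
    # Mini-batch gradient descent (the proposed solver): rows are shuffled
    # and grouped into batches, with one coefficient update per batch.
    idx = np.random.permutation(len(y))
    for start in range(0, len(y), batch_size):
        b = idx[start:start + batch_size]
        w = w - lr * grad_mse(w, X[b], y[b])
    return w

# Tiny usage example on synthetic linear data.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.01 * rng.normal(size=256)
w = np.zeros(4)
for _ in range(50):
    w = minibatch_epoch(X, y, w)
print(np.round(w, 2))   # approximately [1.0, -2.0, 0.5, 3.0]
{code}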

      Related JIRA that will pre-process the input data to be consumed by mini-batch is https://issues.apache.org/jira/browse/MADLIB-1200
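      The exact output format of that pre-processing step is defined in MADLIB-1200; purely as a conceptual sketch (the function and variable names below are hypothetical), it groups shuffled rows into dense per-batch arrays that the mini-batch solver can consume one batch at a time.

{code:python}
import numpy as np

def pack_minibatches(X, y, batch_size=32, seed=0):
    # Conceptual pre-processing only (the real table layout is specified in
    # MADLIB-1200): shuffle the rows once, then group them into dense
    # per-batch arrays so the solver performs one update per batch.
    idx = np.random.default_rng(seed).permutation(len(y))
    return [(X[idx[s:s + batch_size]], y[idx[s:s + batch_size]])
            for s in range(0, len(y), batch_size)]

# Example: 100 rows with a batch size of 32 yield 4 batches of 32+32+32+4 rows.
batches = pack_minibatches(np.ones((100, 3)), np.zeros(100), batch_size=32)
print(len(batches), [len(b[1]) for b in batches])   # 4 [32, 32, 32, 4]
{code}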


            People

              Assignee: Rahul Iyer (riyer)
              Reporter: Nandish Jayaram (njayaram)
              Votes: 0
              Watchers: 4

              Dates

                Created:
                Updated:
                Resolved: