Details
- Type: New Feature
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
Description
Nesterov's accelerated first-order method is a drop-in replacement for steepest descent, but it converges considerably faster: O(1/k^2) versus O(1/k) on smooth convex objectives. We should implement this method and compare its performance with the existing algorithms, including SGD and L-BFGS.
TFOCS (http://cvxr.com/tfocs/) provides a reference implementation of Nesterov's method and its variants for composite objectives.
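To illustrate why it is a drop-in replacement, here is a minimal pure-Python sketch (not MLlib code; function names and the test problem are illustrative) comparing plain steepest descent with Nesterov's accelerated variant on a poorly conditioned 1-D quadratic:

```python
def steepest_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - t * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

def nesterov(grad, x0, step, iters):
    """Nesterov's acceleration: take the gradient step at an
    extrapolated point y, then extrapolate with a growing momentum weight."""
    x_prev, y = x0, x0
    for k in range(iters):
        x = y - step * grad(y)                # gradient step at the look-ahead point
        y = x + k / (k + 3.0) * (x - x_prev)  # momentum extrapolation
        x_prev = x
    return x_prev

# f(x) = 0.005 * x**2, minimizer at 0; small curvature makes plain
# gradient descent slow at this step size.
grad = lambda x: 0.01 * x
gd = steepest_descent(grad, 1.0, step=1.0, iters=100)
nag = nesterov(grad, 1.0, step=1.0, iters=100)
# After the same number of gradient evaluations, the accelerated
# iterate is much closer to the optimum than the plain one.
```

Note the only interface difference is the extra extrapolation state, which is why the method can replace steepest descent without changing the gradient API.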
Attachments
Issue Links
- is part of
  - SPARK-6346 Use faster converging optimization method in MLlib - Resolved
- relates to
  - SPARK-3942 LogisticRegressionWithLBFGS should not use SquaredL2Updater - Resolved
  - SPARK-3382 GradientDescent convergence tolerance - Resolved