Details
- Type: Improvement
- Status: Closed
- Priority: Major
- Resolution: Won't Do
Description
Stochastic gradient descent (SGD) is an optimization technique used widely across ML algorithms. It would therefore be helpful to provide a generalized SGD implementation that can be instantiated with the respective gradient computation. Such a building block would ease the development of future algorithms.
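As a minimal sketch of the idea, the SGD loop can be written once and parameterized by a per-example gradient function; the names below (`sgd`, `squared_loss_gradient`) are hypothetical illustrations, not part of any Flink API:

```python
import random

def sgd(gradient, weights, data, learning_rate=0.1, epochs=100, seed=0):
    """Generic SGD loop: `gradient(weights, example)` supplies the
    per-example gradient, so the same loop serves any differentiable loss."""
    rng = random.Random(seed)
    w = list(weights)
    for _ in range(epochs):
        rng.shuffle(data)                       # visit examples in random order
        for example in data:
            g = gradient(w, example)
            # take a step against the gradient
            w = [wi - learning_rate * gi for wi, gi in zip(w, g)]
    return w

# Example instantiation: squared loss for a 1-D linear model y = w * x.
def squared_loss_gradient(w, example):
    x, y = example
    return [2 * (w[0] * x - y) * x]

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]     # generated by y = 2 * x
w = sgd(squared_loss_gradient, [0.0], data, learning_rate=0.05, epochs=200)
# w[0] converges toward 2.0
```

Swapping in a different gradient function (e.g. for logistic loss) reuses the identical loop, which is the reusability this issue asks for.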
Issue Links
- is blocked by
  - FLINK-1901 Create sample operator for Dataset (Resolved)
- is contained by
  - FLINK-1889 Create optimization framework (Closed)
- is related to
  - FLINK-1979 Implement Loss Functions (Closed)
- is required by
  - FLINK-1992 Add convergence criterion to SGD optimizer (Closed)
  - FLINK-1993 Replace MultipleLinearRegression's custom SGD with optimization framework's SGD (Closed)
  - FLINK-1994 Add different gain calculation schemes to SGD (Closed)
- requires
  - FLINK-2396 Review the datasets of dynamic path and static path in iteration. (Open)