Details
- Type: Improvement
- Status: Resolved
- Priority: Minor
- Resolution: Won't Fix
Description
Currently, the training API encodes the optimizer in the class name, as in LogisticRegressionWithSGD.
If we want to use another optimizer, we have two options: either add a new API such as LogisticRegressionWithNewOptimizer, which duplicates roughly 99% of the existing code, or refactor the API to take the optimizer as a parameter, as in the following.
class LogisticRegression private (
    var optimizer: Optimizer)
  extends GeneralizedLinearAlgorithm[LogisticRegressionModel]
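To illustrate the intent, here is a minimal, self-contained sketch of that pattern. The MLlib types (Optimizer, GradientDescent, LBFGS, LogisticRegressionModel) are replaced with trivial stand-ins so the snippet compiles on its own, and the setOptimizer helper is hypothetical; it only shows how a single training class could accept any optimizer instead of requiring one subclass per optimizer.

// Stand-in for org.apache.spark.mllib.optimization.Optimizer.
trait Optimizer {
  def optimize(data: Seq[(Double, Array[Double])], initialWeights: Array[Double]): Array[Double]
}

// Stand-in optimizers: the real update logic is elided, each simply returns
// the initial weights so the example stays short.
class GradientDescent extends Optimizer {
  override def optimize(data: Seq[(Double, Array[Double])], initialWeights: Array[Double]): Array[Double] =
    initialWeights
}

class LBFGS extends Optimizer {
  override def optimize(data: Seq[(Double, Array[Double])], initialWeights: Array[Double]): Array[Double] =
    initialWeights
}

case class LogisticRegressionModel(weights: Array[Double])

// The training class takes the optimizer as a constructor argument
// (defaulting to SGD for backward compatibility) rather than encoding it in
// the class name, so no LogisticRegressionWithNewOptimizer clone is needed.
class LogisticRegression private (private var optimizer: Optimizer) {

  def this() = this(new GradientDescent)

  // Hypothetical setter: any Optimizer implementation can be plugged in.
  def setOptimizer(optimizer: Optimizer): this.type = {
    this.optimizer = optimizer
    this
  }

  def run(data: Seq[(Double, Array[Double])]): LogisticRegressionModel = {
    val numFeatures = data.headOption.map(_._2.length).getOrElse(0)
    val initialWeights = Array.fill(numFeatures)(0.0)
    LogisticRegressionModel(optimizer.optimize(data, initialWeights))
  }
}

object OptimizerSwapExample {
  def main(args: Array[String]): Unit = {
    val data = Seq((1.0, Array(0.5, 1.5)), (0.0, Array(-0.5, -1.5)))
    // Same training class, two different optimizers:
    val sgdModel   = new LogisticRegression().run(data)
    val lbfgsModel = new LogisticRegression().setOptimizer(new LBFGS).run(data)
    println(s"SGD weights: ${sgdModel.weights.toSeq}, LBFGS weights: ${lbfgsModel.weights.toSeq}")
  }
}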
Issue Links
- relates to SPARK-1856 Standardize MLlib interfaces (Resolved)
- relates to SPARK-5256 Improving MLlib optimization APIs (Resolved)
- relates to SPARK-18303 CLONE - Improving MLlib optimization APIs (Resolved)