Details
- Type: New Feature
- Status: Closed
- Priority: Major
- Resolution: Won't Do
Description
Hyperparameter optimization is a suite of techniques used to find the hyperparameters of a machine learning model that maximize its performance on an independent (test) dataset.
It is most commonly implemented by using cross-validation to estimate model performance on unseen data, with grid search as the strategy for trying out different parameter combinations.
In the future we would also like to support random search and Bayesian optimisation.
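The grid-search-plus-cross-validation strategy described above can be sketched as follows. This is a minimal, self-contained illustration in plain Python, independent of FlinkML's actual API; the model (a 1-D ridge regression with a closed-form solution), the parameter grid, and all function names are illustrative assumptions, not part of the proposed framework:

```python
import random

def fit(train, lam):
    # Closed-form ridge solution for a 1-D linear model y = w * x
    # with L2 penalty lam (illustrative stand-in for a real learner).
    sxx = sum(x * x for x, _ in train)
    sxy = sum(x * y for x, y in train)
    return sxy / (sxx + lam)

def mse(w, data):
    # Mean squared error of the fitted weight on a dataset.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def k_fold_splits(data, k):
    # Deterministic interleaved folds; shuffling could precede this.
    folds = [data[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        yield train, test

def grid_search_cv(data, grid, k=3):
    # Try every hyperparameter value in the grid; score each one by
    # its average validation error across k cross-validation folds.
    best_lam, best_score = None, float("inf")
    for lam in grid:
        score = 0.0
        for train, test in k_fold_splits(data, k):
            w = fit(train, lam)
            score += mse(w, test)
        score /= k
        if score < best_score:
            best_lam, best_score = lam, score
    return best_lam, best_score

random.seed(0)
# Synthetic data from y = 2x plus noise.
data = [(x, 2.0 * x + random.gauss(0, 0.1))
        for x in (i / 10 for i in range(1, 31))]
best_lam, best_score = grid_search_cv(data, grid=[0.0, 0.01, 0.1, 1.0, 10.0])
```

Random search would replace the exhaustive loop over `grid` with sampling from a distribution over parameter values; Bayesian optimisation would instead choose the next candidate based on a surrogate model of the scores observed so far.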
Attachments
Issue Links
- is required by: FLINK-2260 Have a complete model evaluation and selection framework for FlinkML (Closed)
- requires: FLINK-1723 Add cross validation for model evaluation (Closed)