Spark / SPARK-18755

Add Randomized Grid Search to Spark ML


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: ML

    Description

      Randomized Grid Search implements a randomized search over parameters, where each setting is sampled from a distribution over possible parameter values. This has two main benefits over an exhaustive search:
      1. A budget can be chosen independent of the number of parameters and possible values.
      2. Adding parameters that do not influence the performance does not decrease efficiency.

      Randomized grid search usually gives results similar to those of an exhaustive search, while its run time is drastically lower.
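To make the budget idea concrete, here is a minimal, Spark-independent sketch contrasting the two strategies. The parameter names and candidate values are illustrative only, not Spark's API:

```python
import itertools
import random

# Hypothetical parameter space; names and values are illustrative only.
param_grid = {
    "regParam": [0.001, 0.01, 0.1, 1.0],
    "elasticNetParam": [0.0, 0.5, 1.0],
    "maxIter": [10, 50, 100],
}

# Exhaustive search: every combination (4 * 3 * 3 = 36 settings).
exhaustive = [dict(zip(param_grid, combo))
              for combo in itertools.product(*param_grid.values())]

def sample_settings(grid, budget, seed=None):
    """Randomized search: draw `budget` settings, each value sampled
    independently from its candidate list."""
    rng = random.Random(seed)
    return [{name: rng.choice(values) for name, values in grid.items()}
            for _ in range(budget)]

randomized = sample_settings(param_grid, budget=10, seed=42)

print(len(exhaustive))   # 36
print(len(randomized))   # 10 -- the budget is independent of the grid size
```

Note how adding another parameter to `param_grid` multiplies the exhaustive count but leaves the randomized budget unchanged, which is exactly benefit 1 above.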

      For more background, please refer to:

      sklearn: http://scikit-learn.org/stable/modules/grid_search.html
      http://blog.kaggle.com/2015/07/16/scikit-learn-video-8-efficiently-searching-for-optimal-tuning-parameters/
      http://www.jmlr.org/papers/volume13/bergstra12a/bergstra12a.pdf
      https://www.r-bloggers.com/hyperparameter-optimization-in-h2o-grid-search-random-search-and-the-future/.

      There are two ways to implement this in Spark, as I see it:
      1. Add a searchRatio parameter to ParamGridBuilder and conduct sampling directly during build. Only one new public function is required.
      2. Add a trait RandomizedSearch and create new classes RandomizedCrossValidator and RandomizedTrainValidationSplit, which can be complicated since we need to deal with the models.

      I'd prefer option 1 as it's much simpler and more straightforward; we can support randomized grid search with a minimal change.
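A rough sketch of what option 1 could look like, written here as plain Python rather than Spark Scala. The `search_ratio` parameter and `build_grid` helper are hypothetical stand-ins for the proposed ParamGridBuilder change, not Spark's actual API:

```python
import itertools
import math
import random

def build_grid(grid, search_ratio=1.0, seed=None):
    """Build the full Cartesian product of parameter values, then keep only
    a random fraction of it when search_ratio < 1.0.

    `search_ratio` mirrors the searchRatio parameter proposed for
    ParamGridBuilder; this whole function is an illustrative sketch.
    """
    full = [dict(zip(grid, combo))
            for combo in itertools.product(*grid.values())]
    if search_ratio >= 1.0:
        return full
    # Keep at least one setting; sample without replacement.
    k = max(1, math.ceil(search_ratio * len(full)))
    return random.Random(seed).sample(full, k)

grid = {"regParam": [0.001, 0.01, 0.1, 1.0], "maxDepth": [3, 5, 7]}
print(len(build_grid(grid)))                             # 12
print(len(build_grid(grid, search_ratio=0.25, seed=0)))  # 3
```

Because the sampled grid is just a smaller list of the same ParamMap-style settings, existing CrossValidator and TrainValidationSplit code could consume it unchanged, which is why this option needs no new model-handling classes.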

          People

            Assignee: Unassigned
            Reporter: yuhao yang
