SPARK-20047: Constrained Logistic Regression



    • Type: New Feature
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: MLlib
    • Labels: None


      For certain applications, such as stacked regressions, it is important to put non-negative constraints on the regression coefficients. Also, if the ranges of coefficients are known, it makes sense to constrain the coefficient search space.

      Fitting generalized constrained regression models subject to Cβ ≤ b, where C ∈ R^{m×p} and b ∈ R^{m} are predefined matrices and vectors that place a set of m linear constraints on the coefficients, is very challenging, as widely discussed in the literature.

      However, for box constraints on the coefficients, the optimization is well studied. For gradient descent, one can use projected gradient descent in the primal, clipping the coefficients back into the box (for non-negativity constraints, zeroing the negative weights) at each step. For LBFGS, an extended version, LBFGS-B, can handle large-scale box-constrained optimization efficiently. Unfortunately, for OWLQN there is no known efficient way to handle box constraints.
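The projected gradient descent idea above can be sketched in a few lines: take a gradient step on the logistic loss, then clip each coefficient back into its box. This is a minimal illustration, not Spark's implementation; the toy dataset, step size, and bounds are assumptions made for the example.

```python
import math

# Toy data: two features, labels 1 for "positive" points, 0 for "negative".
X = [[1.0, 2.0], [2.0, 1.0], [-1.0, -1.5], [-2.0, -0.5]]
y = [1, 1, 0, 0]
lower, upper = [0.0, 0.0], [5.0, 5.0]  # box constraint: 0 <= beta_j <= 5

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

beta = [0.0, 0.0]
step = 0.1
for _ in range(200):
    # Gradient of the negative log-likelihood of logistic regression.
    grad = [0.0, 0.0]
    for xi, yi in zip(X, y):
        p = sigmoid(sum(b * x for b, x in zip(beta, xi)))
        for j in range(2):
            grad[j] += (p - yi) * xi[j]
    # Gradient step followed by projection onto the box.
    beta = [min(max(b - step * g, lo), hi)
            for b, g, lo, hi in zip(beta, grad, lower, upper)]

print(beta)  # both coefficients remain within [0, 5]
```

The projection is just a per-coordinate clip, which is why box constraints are so much easier to handle than general linear constraints Cβ ≤ b.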

      As a result, in this work we only implement constrained logistic regression with box constraints and without L1 regularization.

      Note that since we standardize the data in the training phase, the coefficients seen by the optimization subroutine live in the scaled space; as a result, we need to convert the box constraints into the scaled space as well.
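The conversion follows from how standardization rescales the margin: if each feature is divided by its standard deviation σ_j, the model value β·x is preserved by optimizing β′_j = β_j · σ_j, so user-supplied bounds on β_j must be multiplied by σ_j before being handed to the optimizer. A small sketch of that arithmetic (variable names are assumptions for the example, not Spark's internals):

```python
import statistics

# Two feature columns and their (population) standard deviations.
feature_cols = [[1.0, 2.0, 3.0, 4.0], [10.0, 20.0, 30.0, 40.0]]
sigmas = [statistics.pstdev(col) for col in feature_cols]

user_lower = [0.0, -1.0]   # bounds in the original feature space
user_upper = [2.0, 1.0]

# Bounds on beta'_j = beta_j * sigma_j in the standardized space.
scaled_lower = [lo * s for lo, s in zip(user_lower, sigmas)]
scaled_upper = [hi * s for hi, s in zip(user_upper, sigmas)]

# Sanity check: the margin is identical in both parameterizations.
beta = [1.0, 0.5]
beta_scaled = [b * s for b, s in zip(beta, sigmas)]
x = [3.0, 30.0]
x_scaled = [v / s for v, s in zip(x, sigmas)]
margin = sum(b * v for b, v in zip(beta, x))
margin_scaled = sum(b * v for b, v in zip(beta_scaled, x_scaled))
```

Because σ_j > 0, the transformation preserves the ordering of the bounds, so the scaled box is still a valid box.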

      Users will be able to set the lower / upper bounds of each coefficient and intercept.





              Assignee: Yanbo Liang
              Reporter: DB Tsai