Description
There might be an error in incubator-spark/mllib/src/main/scala/org/apache/spark/mllib/optimization/Gradient.scala:
the loss function of class LogisticGradient might be wrong.
The original one is:

    val loss =
      if (margin > 0) {
        math.log(1 + math.exp(0 - margin))
      } else {
        math.log(1 + math.exp(margin)) - margin
      }

But when we use this loss function, we find that the loss increases while optimizing, for example in LogisticRegressionWithSGD.
I think it should be something like this:

    val loss =
      if (label > 0) {
        math.log(1 + math.exp(margin))
      } else {
        math.log(1 + math.exp(margin)) - margin
      }

I tested this loss function and it works well.
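To illustrate why the branch condition matters, here is a small self-contained sketch (not MLlib code). It assumes MLlib's convention margin = -weights.dot(data), and it assumes the original code's positive-margin branch was math.log(1 + math.exp(0 - margin)), i.e. the numerically stable equivalent of its else branch. It compares both variants against the negative log-likelihood computed directly from the model probability:

```scala
object LogisticLossCheck {
  // Reference: negative log-likelihood computed directly from P(y = 1 | x).
  def referenceLoss(dot: Double, label: Double): Double = {
    val p = 1.0 / (1.0 + math.exp(-dot)) // logistic model probability
    if (label > 0) -math.log(p) else -math.log(1.0 - p)
  }

  // Original variant: branches on the sign of the margin, ignoring the label.
  def originalLoss(dot: Double, label: Double): Double = {
    val margin = -dot
    if (margin > 0) math.log(1 + math.exp(0 - margin))
    else math.log(1 + math.exp(margin)) - margin
  }

  // Proposed fix: branches on the label.
  def fixedLoss(dot: Double, label: Double): Double = {
    val margin = -dot
    if (label > 0) math.log(1 + math.exp(margin))
    else math.log(1 + math.exp(margin)) - margin
  }

  def main(args: Array[String]): Unit = {
    // dot = weights.dot(data), label in {0, 1}
    for ((dot, label) <- Seq((2.0, 1.0), (2.0, 0.0), (-1.5, 1.0), (-1.5, 0.0))) {
      println(f"dot=$dot%5.1f label=$label%3.1f " +
        f"ref=${referenceLoss(dot, label)}%8.5f " +
        f"fixed=${fixedLoss(dot, label)}%8.5f " +
        f"original=${originalLoss(dot, label)}%8.5f")
    }
  }
}
```

The fixed variant matches the reference on every case, while the original variant returns the same value for both labels at a given point, so its sum over a dataset is not the quantity SGD is actually decreasing.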