Details
- Type: Bug
- Status: Closed
- Priority: Minor
- Resolution: Duplicate
- Affects Version/s: 1.5.2
- Fix Version/s: None
- Component/s: None
Description
Calling `extractParamMap()` on a model that has been fit returns an empty dictionary. For example (adapted from the [PySpark ML API Documentation](http://spark.apache.org/docs/latest/ml-guide.html#example-estimator-transformer-and-param)):
```python
from pyspark.mllib.linalg import Vectors
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.param import Param, Params

# Prepare training data from a list of (label, features) tuples.
training = sqlContext.createDataFrame([
    (1.0, Vectors.dense([0.0, 1.1, 0.1])),
    (0.0, Vectors.dense([2.0, 1.0, -1.0])),
    (0.0, Vectors.dense([2.0, 1.3, 1.0])),
    (1.0, Vectors.dense([0.0, 1.2, -0.5]))], ["label", "features"])

# Create a LogisticRegression instance. This instance is an Estimator.
lr = LogisticRegression(maxIter=10, regParam=0.01)

# Print out the parameters, documentation, and any default values.
print "LogisticRegression parameters:\n" + lr.explainParams() + "\n"

# Learn a LogisticRegression model. This uses the parameters stored in lr.
model1 = lr.fit(training)

# Since model1 is a Model (i.e., a transformer produced by an Estimator),
# we can view the parameters it used during fit().
# This prints the parameter (name: value) pairs, where names are unique IDs
# for this LogisticRegression instance.
print "Model 1 was fit using parameters: "
print model1.extractParamMap()
```
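For comparison, here is a minimal sketch (assuming the same interactive shell as above, with `sqlContext` already defined) that contrasts the empty map returned by the fitted model with the populated map on the estimator itself; until the model carries its own Param values, reading them from the estimator that produced it is one possible workaround:

```python
from pyspark.mllib.linalg import Vectors
from pyspark.ml.classification import LogisticRegression

# Small training set, as in the example above.
training = sqlContext.createDataFrame([
    (1.0, Vectors.dense([0.0, 1.1, 0.1])),
    (0.0, Vectors.dense([2.0, 1.0, -1.0]))], ["label", "features"])

lr = LogisticRegression(maxIter=10, regParam=0.01)
model1 = lr.fit(training)

# Observed on the affected versions: the fitted model reports no params.
print model1.extractParamMap()   # prints {} instead of the (Param: value) pairs used in fit()

# The estimator's param map is populated, so it can be consulted instead.
print lr.extractParamMap()       # includes maxIter=10, regParam=0.01, plus defaults
```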
Issue Links
- duplicates: SPARK-10931 PySpark ML Models should contain Param values (Resolved)
- is contained by: SPARK-14771 Python ML Param and UID issues (Resolved)