SPARK-24477: Import submodules under pyspark.ml by default


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.4.0
    • Fix Version/s: 2.4.0
    • Component/s: ML, PySpark
    • Labels: None

    Description

      Right now, we do not import submodules under pyspark.ml by default, so users cannot do

      from pyspark import ml
      kmeans = ml.clustering.KMeans(...)
      

      I created this JIRA to discuss whether we should import the submodules by default. It would change the behavior of

      from pyspark.ml import *
      

      But it would remove the need for otherwise unnecessary submodule imports.
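
      A minimal sketch of the kind of change this implies in pyspark/ml/__init__.py; the submodule list below is illustrative, not necessarily the exact set touched by the fix:

      # pyspark/ml/__init__.py -- illustrative sketch, not the exact patch
      # Importing the submodules here binds them as attributes of the
      # pyspark.ml package, so `from pyspark import ml` alone is enough
      # to reach e.g. ml.clustering.KMeans.
      from pyspark.ml import (classification, clustering, evaluation,
                              feature, regression, tuning)

      # The real file would extend its existing __all__ rather than replace
      # it; listing the submodule names there is what changes
      # `from pyspark.ml import *`, which would then also bind names such
      # as `clustering` and `feature`.
      __all__ = ["classification", "clustering", "evaluation",
                 "feature", "regression", "tuning"]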

      cc hyukjin.kwon
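
      For reference, a usage sketch of the proposed behavior (assumes a Spark version that includes this change, 2.4.0 per the fix version above; the parameter values are arbitrary):

      from pyspark.sql import SparkSession
      from pyspark import ml

      # A local session so the JVM-backed estimator below can be constructed.
      spark = SparkSession.builder.master("local[1]").getOrCreate()

      # With submodules imported by default, no separate
      # `import pyspark.ml.clustering` is needed.
      kmeans = ml.clustering.KMeans(k=2, seed=1)
      print(kmeans.getK())  # 2

      spark.stop()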


            People

              Assignee: Hyukjin Kwon (gurwls223)
              Reporter: Xiangrui Meng (mengxr)
              Votes: 0
              Watchers: 2
