[SPARK-22406] pyspark version tag is wrong on PyPI


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.2.0
    • Fix Version/s: 2.1.2, 2.2.1
    • Component/s: PySpark
    • Labels: None

    Description

      On pypi.python.org, the pyspark package is tagged with version 2.2.0.post0: https://pypi.python.org/pypi/pyspark/2.2.0

      However, when you install the package, the installed copy reports its version as 2.2.0.
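
      You can see the mismatch locally after installing (hypothetical session; pip show output abridged):

        $ pip install pyspark==2.2.0.post0
        ...
        $ pip show pyspark
        Name: pyspark
        Version: 2.2.0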

      This has really annoying consequences: pip install pyspark==2.2.0 fails, because no distribution with that exact version string exists on PyPI; you have to run pip install pyspark==2.2.0.post0 instead. Worse, if you later run that same command again, pip does not recognize the existing pyspark installation as satisfying the requirement (its metadata says 2.2.0) and reinstalls it, which is very slow because pyspark is a large package. The session below illustrates this.
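
      A minimal reproduction (hypothetical shell session; exact pip messages vary by pip version):

        $ pip install pyspark==2.2.0
        Could not find a version that satisfies the requirement pyspark==2.2.0 (from versions: 2.2.0.post0)

        $ pip install pyspark==2.2.0.post0
        Successfully installed pyspark-2.2.0

        $ pip install pyspark==2.2.0.post0
        (downloads and reinstalls pyspark, because the installed metadata says 2.2.0, not 2.2.0.post0)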

      This bites especially hard when pyspark is pinned in a requirements.txt file: every run of pip install -r requirements.txt reinstalls pyspark, so adding a single new package to the file takes far longer than necessary (see the sketch below).
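
      For example, with a hypothetical requirements file like this, each pip install -r requirements.txt run re-downloads pyspark even though it is already installed:

        # requirements.txt (hypothetical example)
        pyspark==2.2.0.post0    # the only version string PyPI will resolve
        requests==2.18.4        # adding any new pin triggers another full run, reinstalling pyspark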

      Can you please change the package on PyPI so that its published version string matches the installed version, 2.2.0?

      Attachments

        Activity


          People

            Assignee: Holden Karau (holden)
            Reporter: Kerrick Staley (kerrick-lyft)
            Votes: 0
            Watchers: 5

            Dates

              Created:
              Updated:
              Resolved:
