Spark / SPARK-3265

Allow using custom ipython executable with pyspark


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 1.0.2, 1.1.0
    • Fix Version/s: 1.2.0
    • Component/s: PySpark
    • Labels: None

    Description

      Although you can make pyspark use ipython with IPYTHON=1, and also change the python executable with PYSPARK_PYTHON=..., you cannot use both at the same time because the launch script hardcodes the default ipython executable.

      This change makes pyspark use the PYSPARK_PYTHON variable when it is set and fall back to the default ipython otherwise, mirroring how the default python executable is already handled.

      So you can use a custom ipython like so:
      PYSPARK_PYTHON=./anaconda/bin/ipython IPYTHON_OPTS="notebook" pyspark
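
      The fallback described above can be sketched in shell. This is a hypothetical simplification of the interpreter selection in bin/pyspark, not the actual patch; the PYSPARK_DRIVER variable name is illustrative:

      ```shell
      # Hypothetical sketch of the interpreter-selection fallback
      # (not the actual bin/pyspark patch).
      if [ "$IPYTHON" = "1" ]; then
        # Prefer a user-supplied interpreter; fall back to the default ipython.
        PYSPARK_DRIVER="${PYSPARK_PYTHON:-ipython}"
      else
        # Same fallback pattern already used for plain python.
        PYSPARK_DRIVER="${PYSPARK_PYTHON:-python}"
      fi
      echo "Would launch: $PYSPARK_DRIVER"
      ```

      With neither variable set this selects plain python; with IPYTHON=1 and PYSPARK_PYTHON pointing at a custom ipython, the custom binary wins, which is the behavior the issue asks for.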


          People

            Assignee: Rob O'Dwyer
            Reporter: Rob O'Dwyer
            Votes: 0
            Watchers: 2

