Spark / SPARK-1458

Expose sc.version in PySpark


Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 0.9.0
    • Fix Version/s: 1.1.0
    • Component/s: PySpark, Spark Core
    • Labels: None

    Description

      As discussed here, I think it would be nice if there was a way to programmatically determine what version of Spark you are running.

      The potential use cases are not that important, but they include:

      1. Branching your code based on what version of Spark is running.
      2. Checking your version without having to quit and restart the Spark shell.

      Right now in PySpark, I believe the only way to determine your version is by firing up the Spark shell and looking at the startup banner.
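
      The feature was ultimately exposed as the `sc.version` property on `SparkContext`. A minimal sketch of use case 1 (branching on the running Spark version); the `at_least` helper below is hypothetical, written here to show how a dotted version string like the one `sc.version` returns can be compared numerically:

      ```python
      def at_least(version, required):
          """Compare dotted version strings numerically, so "1.10.0" >= "1.2.0"."""
          to_tuple = lambda v: tuple(int(p) for p in v.split("."))
          return to_tuple(version) >= to_tuple(required)

      # In a live PySpark session you would pass sc.version instead of a literal:
      if at_least("1.1.0", "0.9.0"):
          # safe to use APIs introduced in 0.9.0
          pass
      ```

      Comparing tuples of integers avoids the string-comparison trap where "0.10.0" sorts before "0.9.0" lexicographically.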

      Attachments

        Activity

          People

            Assignee: joshrosen Josh Rosen
            Reporter: nchammas Nicholas Chammas
            Votes: 0
            Watchers: 5
