Spark / SPARK-1808

bin/pyspark does not load default configuration properties


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.0.0
    • Fix Version/s: 1.0.0
    • Component/s: None
    • Labels: None

    Description

      ... because it doesn't go through spark-submit. Either we make it go through spark-submit (hard), or we extract the logic that loads the default configurations and set the resulting properties on the JVM that launches the py4j GatewayServer (easier).

      Right now, the only way to set config values for bin/pyspark is through SPARK_JAVA_OPTS in spark-env.sh, which is supposedly deprecated.
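The "easier" option described above could be sketched as follows: parse conf/spark-defaults.conf ourselves and turn each property into a -D system property for the JVM that hosts the py4j GatewayServer. This is an illustrative sketch only, not Spark's actual implementation; the function names and the sample configuration are hypothetical.

```python
def load_default_properties(text):
    """Parse spark-defaults.conf-style text into a dict.

    Each non-blank line is 'key value' (whitespace-separated);
    lines starting with '#' are comments.
    """
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split(None, 1)  # split key from the rest of the line
        if len(parts) == 2:
            props[parts[0]] = parts[1].strip()
    return props


def to_java_opts(props):
    """Render properties as -D flags to pass to the GatewayServer JVM."""
    return ["-D%s=%s" % (k, v) for k, v in sorted(props.items())]


if __name__ == "__main__":
    # Hypothetical spark-defaults.conf contents for illustration.
    sample = """
    # Example defaults
    spark.master            local[2]
    spark.executor.memory   512m
    """
    print(" ".join(to_java_opts(load_default_properties(sample))))
    # -> -Dspark.executor.memory=512m -Dspark.master=local[2]
```

With something like this, bin/pyspark could prepend the rendered flags to the java command line it builds, so the defaults file is honored even though spark-submit is bypassed.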

      Attachments

        Activity

          People

            Assignee: andrewor14 Andrew Or
            Reporter: andrewor14 Andrew Or
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved: