SPARK-21840

Allow multiple SparkSubmit invocations in same JVM without polluting system properties

    Details

    • Type: New Feature
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.0
    • Component/s: Spark Core
    • Labels: None

      Description

      Filing this as a sub-task of SPARK-11035; this feature was discussed as part of the PR currently attached to that bug.

      Basically, to allow the launcher library to run applications in-process, the easiest way is for it to run the SparkSubmit class. But that class currently propagates configuration to applications by modifying system properties.

      That means that when launching multiple applications in that manner in the same JVM, the configuration of the first application may leak into the second application (or to any other invocation of `new SparkConf()` for that matter).
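      The leak is easy to reproduce without Spark at all, since it is just shared mutable JVM state. A minimal sketch (`SysPropLeak`, `submitFirstApp`, and `secondAppSeesName` are hypothetical stand-ins, not actual Spark classes): SparkConf picks up any `spark.*` system property at construction time, so a property set on behalf of the first in-process submit is still visible to a later one.

      ```java
      public class SysPropLeak {
          // Stand-in for what SparkSubmit did: push the first
          // application's configuration into global JVM state.
          static void submitFirstApp() {
              System.setProperty("spark.app.name", "first-app");
          }

          // A second "application" building its config in the same JVM
          // reads the same global state and sees the first app's value.
          static String secondAppSeesName() {
              return System.getProperty("spark.app.name", "<unset>");
          }

          public static void main(String[] args) {
              submitFirstApp();
              System.out.println(secondAppSeesName()); // prints "first-app"
          }
      }
      ```

      Passing configuration through a dedicated SparkConf instance instead, as this issue proposes, avoids that shared state entirely.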

      This feature is about breaking out the fix for this particular issue from the PR linked to SPARK-11035. With the changes in SPARK-21728, the implementation can even be further enhanced by providing an actual SparkConf instance to the application, instead of opaque maps.

              People

              • Assignee: Marcelo Vanzin (vanzin)
              • Reporter: Marcelo Vanzin (vanzin)
              • Votes: 0
              • Watchers: 6
