SPARK-22554: Add a config to control if PySpark should use daemon or not


    Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Trivial
    • Resolution: Fixed
    • Affects Version/s: 2.3.0
    • Fix Version/s: 2.3.0
    • Component/s: PySpark
    • Labels: None

      Description

      Actually, SparkR already has a flag for useDaemon:

      https://github.com/apache/spark/blob/478fbc866fbfdb4439788583281863ecea14e8af/core/src/main/scala/org/apache/spark/api/r/RRunner.scala#L362

      It'd be great if we had this flag too. It would make it easier to test Windows-specific issues.

      This is also partly for running Python coverage without extra code changes. I know a hacky way to run this:

      https://github.com/apache/spark/pull/19630#issuecomment-345490662
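
      For illustration, here is a minimal PySpark sketch of how such a flag could be used once added. It assumes the flag is exposed as a boolean SparkConf entry named spark.python.use.daemon (mirroring SparkR's useDaemon); the exact config name is an assumption, since this issue does not specify it.

          from pyspark import SparkConf, SparkContext

          # Assumed config name: spark.python.use.daemon (not named in this issue).
          # Setting it to "false" would make each task launch a fresh Python worker
          # instead of forking one from a long-running daemon process, which helps
          # when testing Windows-specific behavior or collecting per-worker coverage.
          conf = SparkConf().setAppName("pyspark-no-daemon-test")
          conf.set("spark.python.use.daemon", "false")

          sc = SparkContext(conf=conf)
          print(sc.parallelize(range(10)).sum())  # tasks run in non-daemon workers
          sc.stop()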


            People

            • Assignee: hyukjin.kwon (Hyukjin Kwon)
            • Reporter: hyukjin.kwon (Hyukjin Kwon)
            • Votes: 0
            • Watchers: 2
