Spark / SPARK-47565

PySpark workers dying in daemon mode idle queue fail query


Details

    • Type: Improvement
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.4.2, 3.5.1, 3.3.4
    • Fix Version/s: 4.0.0
    • Component/s: PySpark

    Description

      PySpark workers may die after entering the idle queue in `PythonWorkerFactory`. This may happen because of code running in the worker process or because of external factors.

      When such a worker is later drawn from the warm pool, the first read/write raises an I/O exception and the query fails.
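The failure mode described above can be sketched as follows: a pool that hands out idle workers without checking liveness surfaces dead workers as I/O errors, whereas probing the process first lets the pool discard them. This is an illustrative sketch only; the class and method names are hypothetical and do not reflect Spark's actual `PythonWorkerFactory` implementation or fix.

```python
import subprocess
import sys

# Hypothetical warm pool that validates worker liveness before reuse.
# Names here are illustrative, not Spark's actual API.
class WorkerPool:
    def __init__(self):
        self._idle = []

    def release(self, proc):
        # Return a worker to the idle queue for later reuse.
        self._idle.append(proc)

    def acquire(self):
        # Discard workers that died while idle instead of handing them
        # out; a dead worker would otherwise only be noticed as an I/O
        # error on the first read/write.
        while self._idle:
            proc = self._idle.pop()
            if proc.poll() is None:  # still running
                return proc
        # No live idle worker: spawn a fresh one that blocks on stdin,
        # standing in for a long-lived PySpark worker process.
        return subprocess.Popen(
            [sys.executable, "-c", "import sys; sys.stdin.read()"],
            stdin=subprocess.PIPE,
        )
```

Under this sketch, a worker that dies while idle is skipped on `acquire`, so callers never see the first-read/write I/O exception the report describes.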

          People

            awsthni Nikita Awasthi
            bastih-db Sebastian Hillig
            Hyukjin Kwon Hyukjin Kwon
            Votes: 0
            Watchers: 3

            Dates

              Created:
              Updated:
              Resolved: